32134 1727204424.98738: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-twx executable location = /usr/local/bin/ansible-playbook python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 32134 1727204424.99847: Added group all to inventory 32134 1727204424.99850: Added group ungrouped to inventory 32134 1727204424.99855: Group all now contains ungrouped 32134 1727204424.99859: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml 32134 1727204425.26374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 32134 1727204425.26461: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 32134 1727204425.26491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 32134 1727204425.26574: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 32134 1727204425.26678: Loaded config def from plugin (inventory/script) 32134 1727204425.26681: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 32134 1727204425.26737: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 32134 1727204425.26860: Loaded config def from plugin (inventory/yaml) 32134 1727204425.26862: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 32134 1727204425.26983: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 32134 1727204425.27582: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 32134 1727204425.27586: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 32134 1727204425.27592: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 32134 1727204425.27600: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 32134 1727204425.27605: Loading data from /tmp/network-6Zh/inventory-Sfc.yml 32134 1727204425.27706: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto 32134 1727204425.27798: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 32134 1727204425.27849: Loading data from /tmp/network-6Zh/inventory-Sfc.yml 32134 1727204425.27971: group all already in inventory 32134 1727204425.27979: set inventory_file for managed-node1 32134 1727204425.27984: set inventory_dir for managed-node1 32134 1727204425.27985: Added host managed-node1 to inventory 32134 1727204425.27988: Added host managed-node1 to group all 32134 1727204425.27991: set ansible_host for managed-node1 32134 1727204425.27992: set ansible_ssh_extra_args for managed-node1 32134 1727204425.27996: set inventory_file for managed-node2 32134 1727204425.27999: set inventory_dir for managed-node2 32134 1727204425.28000: Added host managed-node2 to inventory 32134 1727204425.28002: Added host managed-node2 to group 
all 32134 1727204425.28003: set ansible_host for managed-node2 32134 1727204425.28005: set ansible_ssh_extra_args for managed-node2 32134 1727204425.28008: set inventory_file for managed-node3 32134 1727204425.28010: set inventory_dir for managed-node3 32134 1727204425.28012: Added host managed-node3 to inventory 32134 1727204425.28015: Added host managed-node3 to group all 32134 1727204425.28017: set ansible_host for managed-node3 32134 1727204425.28018: set ansible_ssh_extra_args for managed-node3 32134 1727204425.28021: Reconcile groups and hosts in inventory. 32134 1727204425.28026: Group ungrouped now contains managed-node1 32134 1727204425.28028: Group ungrouped now contains managed-node2 32134 1727204425.28030: Group ungrouped now contains managed-node3 32134 1727204425.28132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 32134 1727204425.28306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 32134 1727204425.28373: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 32134 1727204425.28419: Loaded config def from plugin (vars/host_group_vars) 32134 1727204425.28422: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 32134 1727204425.28430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 32134 1727204425.28440: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 32134 1727204425.28495: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 32134 1727204425.28917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204425.29041: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 32134 1727204425.29103: Loaded config def from plugin (connection/local) 32134 1727204425.29107: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 32134 1727204425.30069: Loaded config def from plugin (connection/paramiko_ssh) 32134 1727204425.30073: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 32134 1727204425.31346: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32134 1727204425.31408: Loaded config def from plugin (connection/psrp) 32134 1727204425.31412: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 32134 1727204425.32516: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32134 1727204425.32569: Loaded config def from plugin (connection/ssh) 32134 1727204425.32572: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 32134 1727204425.35241: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32134 1727204425.35298: Loaded config def from plugin (connection/winrm) 32134 1727204425.35301: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 32134 1727204425.35348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 32134 1727204425.35427: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 32134 1727204425.35531: Loaded config def from plugin (shell/cmd) 32134 1727204425.35534: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 32134 1727204425.35574: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 32134 1727204425.35680: Loaded config def from plugin (shell/powershell) 32134 1727204425.35683: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 32134 1727204425.35752: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 32134 1727204425.36037: Loaded config def from plugin (shell/sh) 32134 1727204425.36040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 32134 1727204425.36081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 32134 1727204425.36274: Loaded config def from plugin (become/runas) 32134 1727204425.36277: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 32134 1727204425.36570: Loaded config def from plugin (become/su) 32134 1727204425.36573: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 32134 1727204425.36823: Loaded config def from plugin (become/sudo) 32134 1727204425.36826: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 32134 1727204425.36875: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml 32134 1727204425.37330: in VariableManager get_vars() 32134 1727204425.37356: done with get_vars() 32134 1727204425.37534: trying /usr/local/lib/python3.12/site-packages/ansible/modules 32134 1727204425.41419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 32134 1727204425.41586: in VariableManager get_vars() 32134 1727204425.41594: done with get_vars() 32134 1727204425.41598: variable 'playbook_dir' from source: magic vars 32134 1727204425.41599: variable 'ansible_playbook_python' from source: magic vars 32134 1727204425.41600: variable 'ansible_config_file' 
from source: magic vars 32134 1727204425.41601: variable 'groups' from source: magic vars 32134 1727204425.41602: variable 'omit' from source: magic vars 32134 1727204425.41603: variable 'ansible_version' from source: magic vars 32134 1727204425.41604: variable 'ansible_check_mode' from source: magic vars 32134 1727204425.41605: variable 'ansible_diff_mode' from source: magic vars 32134 1727204425.41606: variable 'ansible_forks' from source: magic vars 32134 1727204425.41607: variable 'ansible_inventory_sources' from source: magic vars 32134 1727204425.41608: variable 'ansible_skip_tags' from source: magic vars 32134 1727204425.41609: variable 'ansible_limit' from source: magic vars 32134 1727204425.41610: variable 'ansible_run_tags' from source: magic vars 32134 1727204425.41611: variable 'ansible_verbosity' from source: magic vars 32134 1727204425.41658: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml 32134 1727204425.42451: in VariableManager get_vars() 32134 1727204425.42472: done with get_vars() 32134 1727204425.42526: in VariableManager get_vars() 32134 1727204425.42561: done with get_vars() 32134 1727204425.42610: in VariableManager get_vars() 32134 1727204425.42628: done with get_vars() 32134 1727204425.42800: in VariableManager get_vars() 32134 1727204425.42821: done with get_vars() 32134 1727204425.42827: variable 'omit' from source: magic vars 32134 1727204425.42850: variable 'omit' from source: magic vars 32134 1727204425.42897: in VariableManager get_vars() 32134 1727204425.42920: done with get_vars() 32134 1727204425.42978: in VariableManager get_vars() 32134 1727204425.42996: done with get_vars() 32134 1727204425.43051: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32134 1727204425.43380: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32134 1727204425.43576: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32134 1727204425.44610: in VariableManager get_vars() 32134 1727204425.44638: done with get_vars() 32134 1727204425.45249: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 32134 1727204425.45459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32134 1727204425.48375: in VariableManager get_vars() 32134 1727204425.48379: done with get_vars() 32134 1727204425.48382: variable 'playbook_dir' from source: magic vars 32134 1727204425.48384: variable 'ansible_playbook_python' from source: magic vars 32134 1727204425.48385: variable 'ansible_config_file' from source: magic vars 32134 1727204425.48386: variable 'groups' from source: magic vars 32134 1727204425.48387: variable 'omit' from source: magic vars 32134 1727204425.48388: variable 'ansible_version' from source: magic vars 32134 1727204425.48390: variable 'ansible_check_mode' from source: magic vars 32134 1727204425.48391: variable 'ansible_diff_mode' from source: magic vars 32134 1727204425.48392: variable 'ansible_forks' from source: magic vars 32134 1727204425.48393: variable 'ansible_inventory_sources' from source: magic vars 32134 1727204425.48394: variable 'ansible_skip_tags' from source: magic vars 32134 1727204425.48395: variable 'ansible_limit' from source: magic vars 32134 1727204425.48396: variable 
'ansible_run_tags' from source: magic vars 32134 1727204425.48397: variable 'ansible_verbosity' from source: magic vars 32134 1727204425.48443: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 32134 1727204425.48561: in VariableManager get_vars() 32134 1727204425.48586: done with get_vars() 32134 1727204425.48643: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32134 1727204425.48822: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32134 1727204425.48940: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32134 1727204425.49665: in VariableManager get_vars() 32134 1727204425.49697: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32134 1727204425.51843: in VariableManager get_vars() 32134 1727204425.51847: done with get_vars() 32134 1727204425.51850: variable 'playbook_dir' from source: magic vars 32134 1727204425.51851: variable 'ansible_playbook_python' from source: magic vars 32134 1727204425.51852: variable 'ansible_config_file' from source: magic vars 32134 1727204425.51853: variable 'groups' from source: magic vars 32134 1727204425.51854: variable 'omit' from source: magic vars 32134 1727204425.51855: variable 'ansible_version' from source: magic vars 32134 1727204425.51856: variable 'ansible_check_mode' from source: magic vars 32134 1727204425.51857: variable 'ansible_diff_mode' from source: magic vars 32134 1727204425.51858: variable 'ansible_forks' from source: magic vars 32134 1727204425.51859: variable 'ansible_inventory_sources' from source: magic vars 32134 1727204425.51860: variable 'ansible_skip_tags' from source: magic vars 32134 1727204425.51861: variable 'ansible_limit' from source: magic vars 32134 1727204425.51862: variable 'ansible_run_tags' from source: magic vars 32134 1727204425.51863: variable 'ansible_verbosity' from source: magic vars 32134 1727204425.51912: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 32134 1727204425.52003: in VariableManager get_vars() 32134 1727204425.52016: done with get_vars() 32134 1727204425.52066: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32134 1727204425.52196: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32134 1727204425.54018: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32134 1727204425.54556: in VariableManager get_vars() 32134 1727204425.54582: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32134 1727204425.56604: in VariableManager get_vars() 32134 1727204425.56622: done with get_vars() 32134 1727204425.56668: in VariableManager get_vars() 32134 1727204425.56684: done with get_vars() 32134 1727204425.56731: in VariableManager get_vars() 32134 1727204425.56761: done with get_vars() 32134 1727204425.56807: in VariableManager get_vars() 32134 1727204425.56821: done with get_vars() 32134 1727204425.56897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 32134 1727204425.56914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ 
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 32134 1727204425.57191: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 32134 1727204425.57410: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 32134 1727204425.57414: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 32134 1727204425.57450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 32134 1727204425.57482: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 32134 1727204425.57740: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 32134 1727204425.57825: Loaded config def from plugin (callback/default) 32134 1727204425.57828: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32134 1727204425.59265: Loaded config def from plugin (callback/junit) 32134 1727204425.59268: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32134 1727204425.59325: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 32134 1727204425.59419: Loaded config def from plugin (callback/minimal) 32134 1727204425.59422: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32134 1727204425.59471: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32134 1727204425.59552: Loaded config def from plugin (callback/tree) 32134 1727204425.59555: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to 
ansible.posix.profile_tasks 32134 1727204425.59722: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 32134 1727204425.59725: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ipv6_disabled_nm.yml ******************************************* 5 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml 32134 1727204425.59758: in VariableManager get_vars() 32134 1727204425.59774: done with get_vars() 32134 1727204425.59781: in VariableManager get_vars() 32134 1727204425.59794: done with get_vars() 32134 1727204425.59800: variable 'omit' from source: magic vars 32134 1727204425.59848: in VariableManager get_vars() 32134 1727204425.59865: done with get_vars() 32134 1727204425.59893: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ipv6_disabled.yml' with nm as provider] **** 32134 1727204425.60550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 32134 1727204425.60641: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 32134 1727204425.60674: getting the remaining hosts for this loop 32134 1727204425.60676: done getting the remaining hosts for this loop 32134 1727204425.60680: getting the next task for host managed-node2 32134 1727204425.60685: done getting next task for host managed-node2 32134 1727204425.60687: ^ task is: TASK: Gathering Facts 32134 1727204425.60691: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204425.60695: getting variables 32134 1727204425.60696: in VariableManager get_vars() 32134 1727204425.60707: Calling all_inventory to load vars for managed-node2 32134 1727204425.60711: Calling groups_inventory to load vars for managed-node2 32134 1727204425.60714: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204425.60729: Calling all_plugins_play to load vars for managed-node2 32134 1727204425.60743: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204425.60748: Calling groups_plugins_play to load vars for managed-node2 32134 1727204425.60794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204425.60861: done with get_vars() 32134 1727204425.60869: done getting variables 32134 1727204425.60954: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6 Tuesday 24 September 2024 15:00:25 -0400 (0:00:00.013) 0:00:00.013 ***** 32134 1727204425.60979: entering _queue_task() for managed-node2/gather_facts 32134 1727204425.60981: Creating lock for gather_facts 32134 1727204425.61380: worker is 1 (out of 1 available) 32134 1727204425.61393: exiting _queue_task() for managed-node2/gather_facts 32134 1727204425.61407: done queuing things up, now waiting for results queue to drain 32134 1727204425.61412: waiting for pending results... 
32134 1727204425.61697: running TaskExecutor() for managed-node2/TASK: Gathering Facts 32134 1727204425.61797: in run() - task 12b410aa-8751-753f-5162-0000000000a3 32134 1727204425.61895: variable 'ansible_search_path' from source: unknown 32134 1727204425.61900: calling self._execute() 32134 1727204425.61962: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204425.61975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204425.62005: variable 'omit' from source: magic vars 32134 1727204425.62141: variable 'omit' from source: magic vars 32134 1727204425.62182: variable 'omit' from source: magic vars 32134 1727204425.62248: variable 'omit' from source: magic vars 32134 1727204425.62303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204425.62366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204425.62395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204425.62424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204425.62548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204425.62553: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204425.62556: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204425.62558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204425.62661: Set connection var ansible_timeout to 10 32134 1727204425.62692: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204425.62701: Set connection var ansible_connection to ssh 32134 1727204425.62766: Set connection var ansible_shell_type to sh 32134 1727204425.62769: Set connection var ansible_shell_executable to /bin/sh 32134 1727204425.62772: Set connection var ansible_pipelining to False 32134 1727204425.62774: variable 'ansible_shell_executable' from source: unknown 32134 1727204425.62781: variable 'ansible_connection' from source: unknown 32134 1727204425.62785: variable 'ansible_module_compression' from source: unknown 32134 1727204425.62796: variable 'ansible_shell_type' from source: unknown 32134 1727204425.62804: variable 'ansible_shell_executable' from source: unknown 32134 1727204425.62815: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204425.62825: variable 'ansible_pipelining' from source: unknown 32134 1727204425.62833: variable 'ansible_timeout' from source: unknown 32134 1727204425.62842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204425.63103: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204425.63197: variable 'omit' from source: magic vars 32134 1727204425.63208: starting attempt loop 32134 1727204425.63211: running the handler 32134 1727204425.63216: variable 'ansible_facts' from source: unknown 32134 1727204425.63219: _low_level_execute_command(): starting 32134 1727204425.63221: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204425.64202: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204425.64206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204425.64258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204425.64310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204425.66083: stdout chunk (state=3): >>>/root <<< 32134 1727204425.66205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204425.66321: stderr chunk (state=3): >>><<< 32134 1727204425.66325: stdout chunk (state=3): >>><<< 32134 1727204425.66349: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204425.66467: _low_level_execute_command(): starting 32134 1727204425.66471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396 `" && echo ansible-tmp-1727204425.6635704-32208-147712669774396="` echo /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396 `" ) && sleep 0' 32134 1727204425.67105: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204425.67152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204425.67278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204425.67308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204425.67401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204425.69569: stdout chunk (state=3): >>>ansible-tmp-1727204425.6635704-32208-147712669774396=/root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396 <<< 32134 1727204425.69780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204425.69783: stdout chunk (state=3): >>><<< 32134 1727204425.69786: stderr chunk (state=3): >>><<< 32134 1727204425.69809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204425.6635704-32208-147712669774396=/root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204425.69887: variable 'ansible_module_compression' from source: unknown 32134 1727204425.69941: ANSIBALLZ: Using generic lock for ansible.legacy.setup 32134 1727204425.69950: ANSIBALLZ: Acquiring lock 32134 1727204425.70004: ANSIBALLZ: Lock acquired: 140589353832608 32134 1727204425.70007: ANSIBALLZ: Creating module 32134 1727204426.29503: ANSIBALLZ: Writing module into payload 32134 1727204426.29507: ANSIBALLZ: Writing module 32134 1727204426.29716: ANSIBALLZ: Renaming module 32134 1727204426.29723: ANSIBALLZ: Done creating module 32134 1727204426.29747: variable 'ansible_facts' from source: unknown 32134 1727204426.29754: variable 'inventory_hostname' from source: host vars 
for 'managed-node2' 32134 1727204426.29774: _low_level_execute_command(): starting 32134 1727204426.29777: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 32134 1727204426.30664: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204426.30759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204426.30781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204426.30984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204426.31005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204426.33453: stdout chunk (state=3): >>>PLATFORM <<< 32134 1727204426.33579: stdout chunk (state=3): >>>Linux <<< 32134 1727204426.33609: stdout chunk (state=3): >>>FOUND <<< 32134 1727204426.33649: stdout chunk (state=3): >>>/usr/bin/python3.12 <<< 32134 1727204426.33653: stdout chunk (state=3): >>>/usr/bin/python3 <<< 32134 1727204426.33677: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 32134 1727204426.33956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204426.33960: stdout chunk (state=3): >>><<< 32134 1727204426.33962: stderr chunk (state=3): >>><<< 32134 1727204426.34111: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204426.34116 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 32134 1727204426.34119: _low_level_execute_command(): starting 32134 1727204426.34122: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 32134 1727204426.34215: Sending initial data 32134 1727204426.34227: Sent initial data (1181 bytes) 32134 1727204426.34787: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204426.34830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204426.34875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204426.34925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204426.35031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204426.40619: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 32134 1727204426.41298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204426.41301: stdout chunk (state=3): >>><<< 32134 1727204426.41304: stderr chunk (state=3): >>><<< 32134 1727204426.41306: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204426.41309: variable 'ansible_facts' from source: unknown 32134 1727204426.41311: variable 'ansible_facts' from source: unknown 32134 1727204426.41329: variable 'ansible_module_compression' from source: unknown 32134 1727204426.41378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32134 1727204426.41432: variable 'ansible_facts' from source: unknown 32134 1727204426.41624: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py 32134 1727204426.41819: Sending initial data 32134 1727204426.41823: Sent initial data (154 bytes) 32134 1727204426.42528: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204426.42608: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204426.42627: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 32134 1727204426.42702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204426.45102: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32134 1727204426.45128: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204426.45172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204426.45234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpyj3_jsko /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py <<< 32134 1727204426.45237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py" <<< 32134 1727204426.45281: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpyj3_jsko" to remote "/root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py" <<< 32134 1727204426.48161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204426.48165: stdout chunk (state=3): >>><<< 32134 1727204426.48167: stderr chunk (state=3): >>><<< 32134 1727204426.48380: done transferring module to remote 32134 1727204426.48385: _low_level_execute_command(): starting 32134 1727204426.48387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/ /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py && sleep 0' 32134 1727204426.49690: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204426.49921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 
1727204426.50025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204426.50057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204426.50081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204426.50269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204426.53112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204426.53116: stdout chunk (state=3): >>><<< 32134 1727204426.53118: stderr chunk (state=3): >>><<< 32134 1727204426.53397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204426.53402: _low_level_execute_command(): starting 32134 1727204426.53405: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/AnsiballZ_setup.py && sleep 0' 32134 1727204426.54311: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204426.54335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204426.54349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204426.54364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204426.54480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204426.57804: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 
32134 1727204426.57873: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # <<< 32134 1727204426.57896: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 32134 1727204426.58035: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 32134 1727204426.58082: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 32134 1727204426.58113: stdout chunk (state=3): >>># installing zipimport hook <<< 32134 1727204426.58116: stdout chunk (state=3): >>>import 'time' # <<< 32134 1727204426.58136: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 32134 1727204426.58303: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 32134 1727204426.58339: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 32134 1727204426.58343: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c200c4d0> <<< 32134 1727204426.58373: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1fdbad0> <<< 32134 1727204426.58379: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 32134 1727204426.58409: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c200ea20> <<< 32134 1727204426.58432: stdout chunk (state=3): >>>import '_signal' # <<< 32134 1727204426.58450: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 32134 1727204426.58473: stdout chunk (state=3): >>>import 'io' # <<< 32134 1727204426.58525: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 32134 1727204426.58694: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # <<< 32134 1727204426.58698: stdout chunk (state=3): >>>import 'posixpath' # <<< 32134 1727204426.58728: stdout chunk (state=3): >>>import 'os' # <<< 32134 1727204426.58767: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 32134 1727204426.58787: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 32134 1727204426.58808: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 32134 1727204426.58835: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 32134 1727204426.58860: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dbd0a0> <<< 32134 1727204426.58958: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204426.58998: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dbdfd0> import 'site' # <<< 32134 1727204426.59039: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 32134 1727204426.59646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 32134 1727204426.59706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 32134 1727204426.59709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204426.59924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dfbdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 32134 1727204426.59928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 32134 1727204426.59957: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dfbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 32134 1727204426.60020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 32134 1727204426.60105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204426.60157: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e33800> <<< 32134 1727204426.60173: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 32134 1727204426.60187: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e33e90> import '_collections' # <<< 32134 1727204426.60261: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e13aa0> <<< 32134 1727204426.60546: stdout chunk (state=3): 
>>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e111c0> <<< 32134 1727204426.60787: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1df8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32134 1727204426.60792: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e576e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e56300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e121b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e54bf0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e88710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1df8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 32134 1727204426.60795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.60843: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1e88bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e88a70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1e88e60> <<< 32134 1727204426.60885: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1df6d20> <<< 32134 1727204426.60900: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 32134 1727204426.60956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e89520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e891f0> <<< 32134 1727204426.61115: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e8a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 32134 1727204426.61139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea4650> import 'errno' # <<< 32134 1727204426.61244: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1ea5d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 32134 1727204426.61324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea6c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1ea72f0> <<< 32134 1727204426.61407: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea61e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32134 1727204426.61456: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1ea7d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea74a0> <<< 32134 1727204426.61460: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e8a480> <<< 32134 1727204426.61525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 32134 1727204426.61530: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 32134 1727204426.61576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32134 1727204426.61657: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bafce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 32134 1727204426.61700: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bd87d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bd8530> <<< 32134 1727204426.61731: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bd8800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bd89e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bade80> <<< 32134 1727204426.61748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32134 1727204426.61986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 32134 1727204426.62018: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bda0f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bd8d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e8ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32134 1727204426.62044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 32134 1727204426.62097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 32134 1727204426.62133: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c024b0> <<< 32134 1727204426.62173: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 32134 1727204426.62269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 32134 1727204426.62273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 32134 1727204426.62300: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c1e600> <<< 32134 1727204426.62359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32134 1727204426.62476: stdout chunk (state=3): >>>import 'ntpath' # <<< 32134 1727204426.62481: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c533b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32134 1727204426.62795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 32134 1727204426.62803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32134 1727204426.62843: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c79b20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c534d0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c1f290> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1a944d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c1d640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bdb050> <<< 32134 1727204426.63007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 32134 1727204426.63022: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f40c1a94770> <<< 32134 1727204426.63199: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1rar35o3/ansible_ansible.legacy.setup_payload.zip' <<< 32134 1727204426.63224: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.63515: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py 
<<< 32134 1727204426.63624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 32134 1727204426.63669: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1afe1e0> import '_typing' # <<< 32134 1727204426.63967: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ad5100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ad4260> <<< 32134 1727204426.64010: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.64049: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 32134 1727204426.64301: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 32134 1727204426.66696: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.68126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 32134 1727204426.68140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 32134 1727204426.68306: stdout chunk (state=3): >>> import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ad78c0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1b31b80> <<< 32134 1727204426.68346: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b31940> <<< 32134 1727204426.68417: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b312b0><<< 32134 1727204426.68458: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 32134 1727204426.68484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 32134 1727204426.68533: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b316d0><<< 32134 1727204426.68564: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1afec00> import 'atexit' # <<< 32134 1727204426.68610: stdout chunk (state=3): >>># extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204426.68663: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1b32960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.68666: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.68717: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1b32ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32134 1727204426.68869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 32134 1727204426.68906: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b330e0> <<< 32134 1727204426.68929: stdout chunk (state=3): >>>import 'pwd' # <<< 32134 1727204426.68961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 32134 1727204426.68980: stdout chunk (state=3): >>> <<< 32134 1727204426.69031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32134 1727204426.69080: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1994e30><<< 32134 1727204426.69103: stdout chunk (state=3): >>> <<< 32134 1727204426.69148: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204426.69152: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1996a50><<< 32134 1727204426.69171: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 32134 1727204426.69224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 32134 1727204426.69302: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1997350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32134 1727204426.69382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 32134 1727204426.69424: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1998500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32134 1727204426.69543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 32134 1727204426.69579: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 32134 1727204426.69598: stdout chunk (state=3): >>> <<< 
32134 1727204426.69641: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199aff0> <<< 32134 1727204426.69730: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.69743: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c199b110> <<< 32134 1727204426.69772: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19992b0><<< 32134 1727204426.69822: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 32134 1727204426.69859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 32134 1727204426.69915: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 32134 1727204426.69948: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32134 1727204426.69992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 32134 1727204426.70043: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 32134 1727204426.70086: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 32134 1727204426.70092: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199ef90> import '_tokenize' # <<< 32134 1727204426.70196: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199da60><<< 32134 1727204426.70246: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199d7c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 32134 1727204426.70282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32134 1727204426.70444: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199fc50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19997c0> <<< 32134 1727204426.70515: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.70533: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19e30e0> <<< 32134 1727204426.70704: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e3290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32134 1727204426.70930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19e8e30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e8bf0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19eb3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e9520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 32134 1727204426.70971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204426.71004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 32134 1727204426.71036: stdout chunk (state=3): >>>import '_string' # <<< 32134 1727204426.71064: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f2bd0> <<< 32134 1727204426.71226: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19eb560> <<< 32134 1727204426.71302: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.71335: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f3a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.71403: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f3a70> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f3ef0> <<< 32134 1727204426.71499: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e3560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32134 1727204426.71534: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.71570: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f7680> <<< 32134 1727204426.71797: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f8770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f5e20> <<< 32134 1727204426.71924: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f7170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f5a00> # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.71927: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 32134 1727204426.71938: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.71964: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.72078: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.72130: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 32134 1727204426.72207: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 32134 1727204426.72284: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.72428: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.73137: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.74054: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 32134 1727204426.74058: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 
'ansible.module_utils.common.text.converters' # <<< 32134 1727204426.74092: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c18808c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c18815e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f87d0> <<< 32134 1727204426.74161: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 32134 1727204426.74194: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 32134 1727204426.74211: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.74383: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.74595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 32134 1727204426.74680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1881610> # zipimport: zlib available <<< 32134 1727204426.75181: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.75714: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.75803: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.75927: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 32134 1727204426.75945: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.75988: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32134 1727204426.76083: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.76087: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.76200: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32134 1727204426.76246: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 32134 1727204426.76317: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.76351: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 32134 1727204426.76669: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.76930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32134 1727204426.77022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 32134 1727204426.77125: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f40c1883c50> <<< 32134 1727204426.77295: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.77352: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 32134 1727204426.77356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 32134 1727204426.77435: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.77561: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c188a030> <<< 32134 1727204426.77628: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c188a9c0> <<< 32134 1727204426.77665: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1882e70> # zipimport: zlib available <<< 32134 1727204426.77693: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.77736: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 32134 1727204426.77762: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.77794: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.77838: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.77905: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.77979: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 32134 1727204426.78027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204426.78129: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1889790> <<< 32134 1727204426.78163: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c188abd0> <<< 32134 1727204426.78209: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 32134 1727204426.78281: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.78348: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.78375: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.78507: stdout chunk 
(state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204426.78542: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 32134 1727204426.78563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 32134 1727204426.78587: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32134 1727204426.78655: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1922de0> <<< 32134 1727204426.78709: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1894b60> <<< 32134 1727204426.78798: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c188ec00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c188ea50> # destroy ansible.module_utils.distro <<< 32134 1727204426.78962: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 32134 1727204426.78980: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 32134 1727204426.79009: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 32134 1727204426.79036: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79119: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79146: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.79192: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79241: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79273: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79330: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 32134 1727204426.79433: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79441: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79568: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79572: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 32134 1727204426.79770: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.79963: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.80132: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.80137: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 32134 1727204426.80162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 32134 1727204426.80180: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1925b50> <<< 32134 1727204426.80206: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 32134 1727204426.80269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 32134 1727204426.80333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 32134 1727204426.80373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 32134 1727204426.80385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1364380> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c13648f0> <<< 32134 1727204426.80527: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19053d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19044d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1924230> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1927d10> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 32134 1727204426.80595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 32134 1727204426.80636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 32134 1727204426.80658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 32134 1727204426.80696: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1367680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1366f30> <<< 32134 1727204426.80768: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1367110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1366360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 32134 1727204426.80971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 32134 1727204426.80978: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13677a0> <<< 32134 1727204426.81062: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 32134 1727204426.81077: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c13d22a0> <<< 32134 1727204426.81084: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13d02c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1925310> import 'ansible.module_utils.facts.timeout' # <<< 32134 1727204426.81233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.81253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 32134 1727204426.81272: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.81332: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.81410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 32134 1727204426.81472: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.81525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 32134 1727204426.81616: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 32134 1727204426.81903: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.81927: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.82007: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 32134 1727204426.82532: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.83041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 32134 1727204426.83098: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.83154: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.83194: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.83234: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 32134 1727204426.83272: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.83307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 32134 1727204426.83455: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.83493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.83519: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 32134 1727204426.83551: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.83664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 32134 1727204426.83709: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.83825: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 32134 1727204426.83829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13d2450> <<< 32134 1727204426.83917: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 32134 1727204426.84034: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13d3050> <<< 32134 1727204426.84038: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 32134 1727204426.84122: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.84161: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 32134 1727204426.84232: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.84261: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.84368: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 32134 1727204426.84439: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.84457: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.84623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.84627: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 32134 1727204426.84674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 32134 1727204426.84749: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.84827: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c13fe510> <<< 32134 1727204426.85050: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13ea360> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 32134 1727204426.85176: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 32134 1727204426.85282: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.85361: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.85503: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.85716: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.85751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 32134 1727204426.85772: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.85799: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.85861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 32134 1727204426.85894: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204426.85973: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c141a090> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1419cd0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 32134 1727204426.86014: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.86080: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 32134 1727204426.86159: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.86291: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.86421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 32134 1727204426.86435: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.86643: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.86688: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.86755: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.sysctl' # <<< 32134 1727204426.86776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.86808: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.86956: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.87108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 32134 1727204426.87169: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.87258: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.87493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 32134 1727204426.87523: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.88108: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.88696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 32134 1727204426.88735: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.88828: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.88949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 32134 1727204426.89067: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.89346: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 32134 1727204426.89364: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.89523: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 32134 1727204426.89555: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 32134 1727204426.89603: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.89655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 32134 1727204426.89687: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.89762: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.89883: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.90195: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.90345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 32134 1727204426.90383: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.90446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 32134 1727204426.90568: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 32134 1727204426.90571: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.90586: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.90695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 32134 1727204426.90703: stdout chunk (state=3): >>># zipimport: zlib available <<< 
32134 1727204426.90840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.90856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 32134 1727204426.90918: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.91025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 32134 1727204426.91028: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.91334: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.91593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 32134 1727204426.91608: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.91668: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.91767: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 32134 1727204426.91771: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92006: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 32134 1727204426.92054: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 32134 1727204426.92187: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 32134 1727204426.92245: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 32134 1727204426.92348: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.92377: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92438: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92464: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92516: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.92605: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 32134 1727204426.92692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 32134 1727204426.92715: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204426.92737: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 32134 1727204426.92959: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.93207: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 32134 1727204426.93295: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 32134 1727204426.93316: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.93350: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 32134 1727204426.93403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 32134 1727204426.93422: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.93497: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.93583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 32134 1727204426.93607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 32134 1727204426.93698: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.93798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 32134 1727204426.94006: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204426.94671: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 32134 1727204426.94691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 32134 1727204426.94717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 32134 1727204426.94769: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c0cc3fb0> <<< 32134 1727204426.94783: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0cc17f0> <<< 32134 1727204426.94819: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0cc1b80> <<< 32134 1727204427.09136: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0d08530> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0d09970> <<< 32134 1727204427.09374: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f40c13f7c50> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0d0b740> <<< 32134 1727204427.09676: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 32134 1727204427.09773: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 32134 1727204427.35500: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "26", "epoch": "1727204426", "epoch_int": "1727204426", "date": "2024-09-24", "time": "15:00:26", "iso8601_micro": "2024-09-24T19:00:26.945729Z", "iso8601": "2024-09-24T19:00:26Z", "iso8601_basic": "20240924T150026945729", "iso8601_basic_short": "20240924T150026", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNp<<< 32134 1727204427.35517: stdout chunk (state=3): >>>QPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2829, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 888, "free": 2829}, "nocache": {"free": 3467, "used": 250}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": 
"Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 930, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144622080, "block_size": 4096, "block_total": 64479564, "block_available": 61314605, "block_used": 3164959, "inode_total": 16384000, "inode_available": 16302233, "inode_used": 81767, "uu<<< 32134 1727204427.35527: stdout chunk (state=3): >>>id": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segment<<< 32134 1727204427.35530: stdout chunk (state=3): >>>ation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_loadavg": {"1m": 0.68505859375, "5m": 0.69384765625, "15m": 0.46630859375}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_hostnqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32134 1727204427.36456: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil <<< 32134 1727204427.36470: stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 32134 1727204427.36699: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 32134 1727204427.36704: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array 
# cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 32134 1727204427.36708: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 32134 1727204427.36833: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # 
cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing 
ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 32134 1727204427.37175: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 32134 1727204427.37193: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 32134 1727204427.37236: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 32134 1727204427.37240: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma<<< 32134 1727204427.37262: stdout chunk (state=3): >>> # destroy zipfile._path <<< 32134 1727204427.37338: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 32134 1727204427.37396: stdout chunk (state=3): >>># destroy ntpath <<< 32134 1727204427.37416: stdout chunk (state=3): >>># destroy importlib <<< 32134 1727204427.37421: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 32134 1727204427.37466: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 32134 1727204427.37470: stdout chunk (state=3): >>># destroy _json <<< 32134 1727204427.37501: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 32134 1727204427.37510: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 32134 1727204427.37674: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy 
multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 32134 1727204427.37740: stdout chunk (state=3): >>># destroy _pickle <<< 32134 1727204427.37744: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue <<< 32134 1727204427.37784: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 32134 1727204427.37820: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 32134 1727204427.37824: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 32134 1727204427.37826: stdout chunk (state=3): >>># destroy _ssl <<< 32134 1727204427.37829: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 32134 1727204427.37831: stdout chunk (state=3): >>># destroy json <<< 32134 1727204427.37833: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 32134 1727204427.37835: stdout chunk (state=3): >>># destroy glob # destroy fnmatch <<< 32134 1727204427.37892: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 32134 1727204427.37922: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 32134 1727204427.37936: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 32134 1727204427.37939: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 32134 1727204427.37997: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2<<< 32134 1727204427.38001: stdout chunk (state=3): >>> # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 32134 1727204427.38107: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 32134 1727204427.38113: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # 
cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 32134 1727204427.38184: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 32134 1727204427.38222: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32134 1727204427.38538: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 32134 1727204427.38542: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 32134 1727204427.38576: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 32134 1727204427.38669: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 32134 1727204427.38692: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 32134 1727204427.38711: stdout chunk (state=3): >>># destroy time <<< 32134 1727204427.38746: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 32134 1727204427.38862: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 32134 1727204427.39267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204427.39291: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204427.39443: stderr chunk (state=3): >>><<< 32134 1727204427.39453: stdout chunk (state=3): >>><<< 32134 1727204427.39707: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c200c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1fdbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c200ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dbd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dbdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dfbdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1dfbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e33800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e33e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e13aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e111c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1df8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e576e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e56300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e121b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e54bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e88710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1df8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1e88bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e88a70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1e88e60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1df6d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e89520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e891f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e8a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea4650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1ea5d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f40c1ea6c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1ea72f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea61e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1ea7d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ea74a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e8a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bafce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bd87d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bd8530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bd8800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1bd89e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bade80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bda0f0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bd8d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1e8ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c024b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c1e600> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c533b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c79b20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c534d0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c1f290> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1a944d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1c1d640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1bdb050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f40c1a94770> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1rar35o3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1afe1e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ad5100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ad4260> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1ad78c0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1b31b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b31940> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b312b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b316d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1afec00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1b32960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1b32ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1b330e0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1994e30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1996a50> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1997350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1998500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199aff0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c199b110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19992b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199ef90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199da60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199d7c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c199fc50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19997c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19e30e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e3290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19e8e30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e8bf0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19eb3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e9520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f2bd0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19eb560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f3a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f3a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f3ef0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19e3560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f7680> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f8770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f5e20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c19f7170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f5a00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c18808c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c18815e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19f87d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1881610> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1883c50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c188a030> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c188a9c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1882e70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1889790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c188abd0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1922de0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1894b60> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c188ec00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c188ea50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1925b50> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f40c1364380> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c13648f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19053d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c19044d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1924230> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1927d10> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1367680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1366f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c1367110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1366360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13677a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c13d22a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13d02c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1925310> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13d2450> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13d3050> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c13fe510> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13ea360> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c141a090> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c1419cd0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40c0cc3fb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0cc17f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0cc1b80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0d08530> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0d09970> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c13f7c50> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40c0d0b740> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "26", "epoch": "1727204426", "epoch_int": "1727204426", "date": "2024-09-24", "time": "15:00:26", "iso8601_micro": "2024-09-24T19:00:26.945729Z", "iso8601": "2024-09-24T19:00:26Z", "iso8601_basic": "20240924T150026945729", "iso8601_basic_short": "20240924T150026", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2829, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 888, "free": 2829}, "nocache": {"free": 3467, "used": 250}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 930, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144622080, "block_size": 4096, "block_total": 64479564, "block_available": 61314605, "block_used": 3164959, "inode_total": 16384000, "inode_available": 16302233, "inode_used": 81767, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_loadavg": {"1m": 0.68505859375, "5m": 0.69384765625, "15m": 0.46630859375}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_hostnqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # 
cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing 
multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # 
cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy 
ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # 
destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data:
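The trailing data flagged by that warning is the same Python interpreter shutdown trace already reproduced above for the ansible.legacy.setup invocation; the JSON facts payload itself was parsed successfully, as the "ok: [managed-node2]" result below confirms. As a point of reference only, the following is a minimal, hypothetical playbook sketch (not part of the recorded test run) showing how a couple of the gathered facts, such as ansible_default_ipv4 and ansible_mounts, are typically consumed in a task. The play and task names are illustrative; the host name and fact keys are taken from the log above.

# Hypothetical example, not part of the recorded run: reads two of the facts
# returned by the setup module result shown earlier in this log.
- name: Demonstrate use of gathered facts (illustrative only)
  hosts: managed-node2
  gather_facts: true
  tasks:
    - name: Show the default IPv4 address and the root filesystem device
      ansible.builtin.debug:
        msg:
          - "default IPv4 address: {{ ansible_default_ipv4.address }}"
          - "root filesystem device: {{ (ansible_mounts | selectattr('mount', 'equalto', '/') | list | first).device }}"

Run in the usual way (for example, ansible-playbook -i <inventory> demo.yml, with demo.yml being a placeholder name), such a play would trigger the same fact-gathering step recorded above before the debug task prints the two values.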
[WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 32134 1727204427.42425: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204427.42441: _low_level_execute_command(): starting 32134 1727204427.42447: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204425.6635704-32208-147712669774396/ > /dev/null 2>&1 && sleep 0' 32134 1727204427.43031: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204427.43034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204427.43037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204427.43040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204427.43042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.43084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204427.43118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204427.45721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204427.45775: stderr chunk (state=3): >>><<< 32134 1727204427.45779: stdout chunk (state=3): >>><<< 32134 1727204427.45796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204427.45806: handler run complete 32134 1727204427.45929: variable 'ansible_facts' from source: unknown 32134 1727204427.46022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.46372: variable 'ansible_facts' from source: unknown 32134 1727204427.46598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.46715: attempt loop complete, returning result 32134 1727204427.46727: _execute() done 32134 1727204427.46736: dumping result to json 32134 1727204427.46781: done dumping result, returning 32134 1727204427.46797: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-753f-5162-0000000000a3] 32134 1727204427.46807: sending task result for task 12b410aa-8751-753f-5162-0000000000a3 32134 1727204427.47782: done sending task result for task 12b410aa-8751-753f-5162-0000000000a3 32134 1727204427.47786: WORKER PROCESS EXITING ok: [managed-node2] 32134 1727204427.48377: no more pending results, returning what we have 32134 1727204427.48380: results queue empty 32134 1727204427.48381: checking for any_errors_fatal 32134 1727204427.48383: done checking for any_errors_fatal 32134 1727204427.48384: checking for max_fail_percentage 32134 1727204427.48385: done checking for max_fail_percentage 32134 1727204427.48386: checking to see if all hosts have failed and the running result is not ok 32134 1727204427.48387: done checking to see if all hosts have failed 32134 1727204427.48388: getting the remaining hosts for this loop 32134 1727204427.48391: done getting the remaining hosts for this loop 32134 1727204427.48396: getting the next task for host managed-node2 32134 1727204427.48402: done getting next task for host managed-node2 32134 1727204427.48404: ^ task is: TASK: meta (flush_handlers) 32134 1727204427.48406: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204427.48410: getting variables 32134 1727204427.48412: in VariableManager get_vars() 32134 1727204427.48437: Calling all_inventory to load vars for managed-node2 32134 1727204427.48440: Calling groups_inventory to load vars for managed-node2 32134 1727204427.48444: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204427.48455: Calling all_plugins_play to load vars for managed-node2 32134 1727204427.48459: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204427.48463: Calling groups_plugins_play to load vars for managed-node2 32134 1727204427.48703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.48974: done with get_vars() 32134 1727204427.48988: done getting variables 32134 1727204427.49073: in VariableManager get_vars() 32134 1727204427.49085: Calling all_inventory to load vars for managed-node2 32134 1727204427.49088: Calling groups_inventory to load vars for managed-node2 32134 1727204427.49094: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204427.49099: Calling all_plugins_play to load vars for managed-node2 32134 1727204427.49102: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204427.49106: Calling groups_plugins_play to load vars for managed-node2 32134 1727204427.49320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.49587: done with get_vars() 32134 1727204427.49605: done queuing things up, now waiting for results queue to drain 32134 1727204427.49608: results queue empty 32134 1727204427.49609: checking for any_errors_fatal 32134 1727204427.49614: done checking for any_errors_fatal 32134 1727204427.49615: checking for max_fail_percentage 32134 1727204427.49617: done checking for max_fail_percentage 32134 1727204427.49617: checking to see if all hosts have failed and the running result is not ok 32134 1727204427.49623: done checking to see if all hosts have failed 32134 1727204427.49624: getting the remaining hosts for this loop 32134 1727204427.49625: done getting the remaining hosts for this loop 32134 1727204427.49628: getting the next task for host managed-node2 32134 1727204427.49634: done getting next task for host managed-node2 32134 1727204427.49637: ^ task is: TASK: Include the task 'el_repo_setup.yml' 32134 1727204427.49638: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204427.49641: getting variables 32134 1727204427.49642: in VariableManager get_vars() 32134 1727204427.49652: Calling all_inventory to load vars for managed-node2 32134 1727204427.49654: Calling groups_inventory to load vars for managed-node2 32134 1727204427.49657: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204427.49663: Calling all_plugins_play to load vars for managed-node2 32134 1727204427.49665: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204427.49668: Calling groups_plugins_play to load vars for managed-node2 32134 1727204427.49886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.50199: done with get_vars() 32134 1727204427.50210: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:11 Tuesday 24 September 2024 15:00:27 -0400 (0:00:01.893) 0:00:01.907 ***** 32134 1727204427.50306: entering _queue_task() for managed-node2/include_tasks 32134 1727204427.50309: Creating lock for include_tasks 32134 1727204427.50683: worker is 1 (out of 1 available) 32134 1727204427.50896: exiting _queue_task() for managed-node2/include_tasks 32134 1727204427.50906: done queuing things up, now waiting for results queue to drain 32134 1727204427.50909: waiting for pending results... 32134 1727204427.51043: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 32134 1727204427.51103: in run() - task 12b410aa-8751-753f-5162-000000000006 32134 1727204427.51129: variable 'ansible_search_path' from source: unknown 32134 1727204427.51181: calling self._execute() 32134 1727204427.51275: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204427.51358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204427.51363: variable 'omit' from source: magic vars 32134 1727204427.51452: _execute() done 32134 1727204427.51466: dumping result to json 32134 1727204427.51476: done dumping result, returning 32134 1727204427.51488: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-753f-5162-000000000006] 32134 1727204427.51502: sending task result for task 12b410aa-8751-753f-5162-000000000006 32134 1727204427.51749: done sending task result for task 12b410aa-8751-753f-5162-000000000006 32134 1727204427.51753: WORKER PROCESS EXITING 32134 1727204427.51804: no more pending results, returning what we have 32134 1727204427.51811: in VariableManager get_vars() 32134 1727204427.51850: Calling all_inventory to load vars for managed-node2 32134 1727204427.51854: Calling groups_inventory to load vars for managed-node2 32134 1727204427.51859: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204427.51876: Calling all_plugins_play to load vars for managed-node2 32134 1727204427.51880: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204427.51884: Calling groups_plugins_play to load vars for managed-node2 32134 1727204427.52318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.52629: done with get_vars() 32134 1727204427.52638: variable 'ansible_search_path' from source: unknown 32134 1727204427.52653: we have included files to process 32134 1727204427.52654: 
generating all_blocks data 32134 1727204427.52656: done generating all_blocks data 32134 1727204427.52657: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32134 1727204427.52658: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32134 1727204427.52661: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32134 1727204427.53486: in VariableManager get_vars() 32134 1727204427.53508: done with get_vars() 32134 1727204427.53527: done processing included file 32134 1727204427.53529: iterating over new_blocks loaded from include file 32134 1727204427.53531: in VariableManager get_vars() 32134 1727204427.53545: done with get_vars() 32134 1727204427.53547: filtering new block on tags 32134 1727204427.53564: done filtering new block on tags 32134 1727204427.53567: in VariableManager get_vars() 32134 1727204427.53579: done with get_vars() 32134 1727204427.53581: filtering new block on tags 32134 1727204427.53601: done filtering new block on tags 32134 1727204427.53605: in VariableManager get_vars() 32134 1727204427.53620: done with get_vars() 32134 1727204427.53622: filtering new block on tags 32134 1727204427.53638: done filtering new block on tags 32134 1727204427.53641: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 32134 1727204427.53647: extending task lists for all hosts with included blocks 32134 1727204427.53709: done extending task lists 32134 1727204427.53710: done processing included files 32134 1727204427.53711: results queue empty 32134 1727204427.53715: checking for any_errors_fatal 32134 1727204427.53716: done checking for any_errors_fatal 32134 1727204427.53717: checking for max_fail_percentage 32134 1727204427.53719: done checking for max_fail_percentage 32134 1727204427.53719: checking to see if all hosts have failed and the running result is not ok 32134 1727204427.53720: done checking to see if all hosts have failed 32134 1727204427.53721: getting the remaining hosts for this loop 32134 1727204427.53723: done getting the remaining hosts for this loop 32134 1727204427.53726: getting the next task for host managed-node2 32134 1727204427.53730: done getting next task for host managed-node2 32134 1727204427.53733: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 32134 1727204427.53736: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
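The records above show how include_tasks is resolved: the included file is loaded, split into blocks, each block is filtered on tags, and the resulting tasks are appended to the host's task list ("extending task lists for all hosts with included blocks"). The triggering task in tests_ipv6_disabled_nm.yml is probably no more than the following sketch; only the task name and the included path are taken from the log, the exact form of the include is an assumption:

- name: Include the task 'el_repo_setup.yml'
  ansible.builtin.include_tasks: tasks/el_repo_setup.yml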
False 32134 1727204427.53738: getting variables 32134 1727204427.53739: in VariableManager get_vars() 32134 1727204427.53749: Calling all_inventory to load vars for managed-node2 32134 1727204427.53752: Calling groups_inventory to load vars for managed-node2 32134 1727204427.53779: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204427.53785: Calling all_plugins_play to load vars for managed-node2 32134 1727204427.53790: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204427.53795: Calling groups_plugins_play to load vars for managed-node2 32134 1727204427.54018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204427.54329: done with get_vars() 32134 1727204427.54340: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:00:27 -0400 (0:00:00.041) 0:00:01.948 ***** 32134 1727204427.54422: entering _queue_task() for managed-node2/setup 32134 1727204427.55172: worker is 1 (out of 1 available) 32134 1727204427.55181: exiting _queue_task() for managed-node2/setup 32134 1727204427.55193: done queuing things up, now waiting for results queue to drain 32134 1727204427.55195: waiting for pending results... 32134 1727204427.55708: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 32134 1727204427.55777: in run() - task 12b410aa-8751-753f-5162-0000000000b4 32134 1727204427.55858: variable 'ansible_search_path' from source: unknown 32134 1727204427.55869: variable 'ansible_search_path' from source: unknown 32134 1727204427.56060: calling self._execute() 32134 1727204427.56122: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204427.56181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204427.56204: variable 'omit' from source: magic vars 32134 1727204427.57706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204427.61406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204427.61508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204427.61692: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204427.61697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204427.61700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204427.61747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204427.61788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204427.61834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32134 1727204427.61895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204427.61926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204427.62157: variable 'ansible_facts' from source: unknown 32134 1727204427.62268: variable 'network_test_required_facts' from source: task vars 32134 1727204427.62325: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 32134 1727204427.62339: variable 'omit' from source: magic vars 32134 1727204427.62399: variable 'omit' from source: magic vars 32134 1727204427.62448: variable 'omit' from source: magic vars 32134 1727204427.62488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204427.62530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204427.62557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204427.62590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204427.62608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204427.62649: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204427.62660: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204427.62681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204427.62805: Set connection var ansible_timeout to 10 32134 1727204427.62896: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204427.62899: Set connection var ansible_connection to ssh 32134 1727204427.62902: Set connection var ansible_shell_type to sh 32134 1727204427.62905: Set connection var ansible_shell_executable to /bin/sh 32134 1727204427.62908: Set connection var ansible_pipelining to False 32134 1727204427.62910: variable 'ansible_shell_executable' from source: unknown 32134 1727204427.62914: variable 'ansible_connection' from source: unknown 32134 1727204427.62917: variable 'ansible_module_compression' from source: unknown 32134 1727204427.62918: variable 'ansible_shell_type' from source: unknown 32134 1727204427.62926: variable 'ansible_shell_executable' from source: unknown 32134 1727204427.62935: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204427.62944: variable 'ansible_pipelining' from source: unknown 32134 1727204427.62953: variable 'ansible_timeout' from source: unknown 32134 1727204427.62962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204427.63145: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204427.63164: variable 'omit' from source: magic vars 32134 1727204427.63223: starting attempt loop 32134 
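In the records above, the conditional not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts evaluates to True, so the setup action is queued: facts are gathered only when the facts collected so far do not yet cover every entry of network_test_required_facts (intersect comes from the mathstuff filter plugin loaded just before). A minimal sketch of such a guarded fact-gathering task; the task name, module, and when expression come from the log, while gather_subset and the example fact list are assumptions:

- name: Gather the minimum subset of ansible_facts required by the network role test
  ansible.builtin.setup:
    gather_subset: min                 # assumption: collect only the minimal fact subset
  when: >-
    not ansible_facts.keys() | list | intersect(network_test_required_facts)
    == network_test_required_facts
  vars:
    network_test_required_facts:       # hypothetical example values
      - distribution
      - os_family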
1727204427.63227: running the handler 32134 1727204427.63230: _low_level_execute_command(): starting 32134 1727204427.63232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204427.63997: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204427.64019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.64107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204427.64145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204427.64223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204427.66001: stdout chunk (state=3): >>>/root <<< 32134 1727204427.66208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204427.66212: stdout chunk (state=3): >>><<< 32134 1727204427.66217: stderr chunk (state=3): >>><<< 32134 1727204427.66347: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204427.66358: _low_level_execute_command(): starting 32134 1727204427.66362: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094 `" && echo ansible-tmp-1727204427.6624367-32267-273482943208094="` echo /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094 `" ) && sleep 0' 32134 1727204427.66948: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204427.66964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204427.66978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204427.67059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.67122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204427.67172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204427.67233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204427.69908: stdout chunk (state=3): >>>ansible-tmp-1727204427.6624367-32267-273482943208094=/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094 <<< 32134 1727204427.70011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204427.70067: stderr chunk (state=3): >>><<< 32134 1727204427.70069: stdout chunk (state=3): >>><<< 32134 1727204427.70095: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204427.6624367-32267-273482943208094=/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204427.70142: variable 'ansible_module_compression' from source: unknown 32134 1727204427.70180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32134 1727204427.70233: variable 'ansible_facts' from source: unknown 32134 1727204427.70348: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py 32134 1727204427.70468: Sending initial data 32134 1727204427.70471: Sent initial data (154 bytes) 32134 1727204427.70932: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204427.70936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.70939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204427.70941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204427.70943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.70998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204427.71002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204427.71063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204427.73200: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204427.73248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204427.73288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpt9none6z /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py <<< 32134 1727204427.73302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py" <<< 32134 1727204427.73320: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpt9none6z" to remote "/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py" <<< 32134 1727204427.75299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204427.75345: stderr chunk (state=3): >>><<< 32134 1727204427.75348: stdout chunk (state=3): >>><<< 32134 1727204427.75371: done transferring module to remote 32134 1727204427.75385: _low_level_execute_command(): starting 32134 1727204427.75393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/ /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py && sleep 0' 32134 1727204427.76045: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204427.76153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204427.76157: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.76188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204427.76220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204427.76262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204427.76324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204427.79031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204427.79087: stderr chunk (state=3): >>><<< 32134 1727204427.79093: stdout chunk (state=3): >>><<< 32134 1727204427.79110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
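The chunks above trace the round-trips Ansible makes for this task with pipelining disabled ("Set connection var ansible_pipelining to False" earlier): create a remote temp directory, sftp the AnsiballZ_setup.py payload into it, and chmod it before the module is finally executed, with every step riding the same multiplexed SSH master connection ("auto-mux: Trying existing master"). A hedged sketch of the standard connection variables that shape this behaviour; the variable names are regular Ansible connection settings, the values are illustrative assumptions:

# group_vars/all.yml -- illustrative values only
ansible_pipelining: true        # pipe the module over the open SSH session instead of the mkdir/sftp/chmod sequence above
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s

Enabling pipelining typically also requires that requiretty is not enforced in the remote sudoers configuration.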
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204427.79116: _low_level_execute_command(): starting 32134 1727204427.79119: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/AnsiballZ_setup.py && sleep 0' 32134 1727204427.79597: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204427.79602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204427.79604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204427.79607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204427.79611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204427.79664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204427.79669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204427.79716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204427.83162: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 32134 1727204427.83406: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 32134 1727204427.83437: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.83463: stdout chunk (state=3): >>>import '_codecs' # <<< 32134 1727204427.83508: stdout chunk (state=3): >>>import 'codecs' # <<< 32134 1727204427.83553: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/aliases.py <<< 32134 1727204427.83608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64636d44d0> <<< 32134 1727204427.83658: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64636a3ad0> <<< 32134 1727204427.83662: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 32134 1727204427.83683: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64636d6a20> <<< 32134 1727204427.83746: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 32134 1727204427.83761: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 32134 1727204427.83815: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 32134 1727204427.84006: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 32134 1727204427.84065: stdout chunk (state=3): >>>import 'os' # <<< 32134 1727204427.84083: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 32134 1727204427.84125: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 32134 1727204427.84128: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 32134 1727204427.84178: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 32134 1727204427.84208: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64634c50a0> <<< 32134 1727204427.84328: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.84331: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64634c5fd0> <<< 32134 1727204427.84408: stdout chunk (state=3): >>>import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 32134 1727204427.85078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 32134 1727204427.85094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 32134 1727204427.85123: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.85152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 32134 1727204427.85233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 32134 1727204427.85268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 32134 1727204427.85319: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463503e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 32134 1727204427.85356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 32134 1727204427.85360: stdout chunk (state=3): >>>import '_operator' # <<< 32134 1727204427.85396: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463503f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 32134 1727204427.85432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 32134 1727204427.85464: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 32134 1727204427.85560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 32134 1727204427.85598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646353b860> <<< 32134 1727204427.85636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 32134 1727204427.85665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646353bef0> import '_collections' # <<< 32134 1727204427.85732: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646351bb60> <<< 32134 1727204427.85791: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463519280> <<< 32134 1727204427.85946: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463501040> <<< 32134 1727204427.85983: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 32134 1727204427.86018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 32134 1727204427.86031: stdout chunk (state=3): >>>import '_sre' # <<< 32134 1727204427.86095: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 32134 1727204427.86123: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32134 1727204427.86168: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646355f740> <<< 32134 1727204427.86230: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646355e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646351a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463502f30> <<< 32134 1727204427.86321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 32134 1727204427.86339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463590740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635002c0> <<< 32134 1727204427.86406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6463590bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463590aa0> <<< 32134 1727204427.86452: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.86475: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6463590e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64634fede0> <<< 32134 1727204427.86535: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 32134 1727204427.86602: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 32134 1727204427.86605: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463591520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635911f0> <<< 32134 1727204427.86666: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 32134 1727204427.86696: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463592420> <<< 32134 1727204427.86726: stdout chunk (state=3): >>>import 'importlib.util' # <<< 32134 1727204427.86730: stdout chunk (state=3): >>>import 'runpy' # <<< 32134 1727204427.86809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 32134 1727204427.86837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 32134 1727204427.86865: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635ac650> import 'errno' # <<< 32134 1727204427.86918: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.86921: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64635add60> <<< 32134 1727204427.86951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 32134 1727204427.86996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 32134 1727204427.87011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635aec60> <<< 32134 1727204427.87066: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64635af2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635ae1b0> <<< 32134 1727204427.87117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32134 1727204427.87155: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.87180: stdout chunk (state=3): >>># extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64635afd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635af470> <<< 32134 1727204427.87250: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463592480> <<< 32134 1727204427.87322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 32134 1727204427.87346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 32134 1727204427.87370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32134 1727204427.87438: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632a3cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 32134 1727204427.87476: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632cc7a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632cc500> <<< 32134 1727204427.87556: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632cc7d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632cc9b0> <<< 32134 1727204427.87577: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632a1e50> <<< 32134 1727204427.87609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32134 1727204427.87842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 32134 1727204427.87845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632ce000> <<< 32134 1727204427.87935: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f64632ccc80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463592b70> <<< 32134 1727204427.87938: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32134 1727204427.88036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.88111: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 32134 1727204427.88147: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632fa3c0> <<< 32134 1727204427.88232: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 32134 1727204427.88261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.88299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 32134 1727204427.88307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 32134 1727204427.88371: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463312510> <<< 32134 1727204427.88403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 32134 1727204427.88493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32134 1727204427.88557: stdout chunk (state=3): >>>import 'ntpath' # <<< 32134 1727204427.88616: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.88800: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646334b2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32134 1727204427.89109: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463371a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646334b410> <<< 32134 1727204427.89134: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64633131a0> <<< 32134 1727204427.89185: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 32134 1727204427.89237: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646318c410> 
<<< 32134 1727204427.89241: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463311550> <<< 32134 1727204427.89271: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632cef60> <<< 32134 1727204427.89523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 32134 1727204427.89560: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f646318c6e0> <<< 32134 1727204427.89881: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_261mcnkr/ansible_setup_payload.zip' <<< 32134 1727204427.89895: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.90163: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.90221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 32134 1727204427.90244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 32134 1727204427.90305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 32134 1727204427.90441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 32134 1727204427.90598: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631fa1e0> import '_typing' # <<< 32134 1727204427.90822: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631d10d0> <<< 32134 1727204427.90850: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631d0230> # zipimport: zlib available <<< 32134 1727204427.90910: stdout chunk (state=3): >>>import 'ansible' # <<< 32134 1727204427.90968: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.90973: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.91007: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.91038: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 32134 1727204427.91051: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.93644: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204427.95955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 32134 1727204427.96214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631d35f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6463229bb0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463229970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463229280><<< 32134 1727204427.96217: stdout chunk (state=3): >>> <<< 32134 1727204427.96248: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 32134 1727204427.96269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 32134 1727204427.96450: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632299d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631fae70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646322a8d0> <<< 32134 1727204427.96544: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.96548: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646322ab10> <<< 32134 1727204427.96551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32134 1727204427.96629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 32134 1727204427.96672: stdout chunk (state=3): >>>import '_locale' # <<< 32134 1727204427.96736: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646322b020> <<< 32134 1727204427.96757: stdout chunk (state=3): >>>import 'pwd' # <<< 32134 1727204427.96796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 32134 1727204427.96845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32134 1727204427.96908: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463090dd0> <<< 32134 1727204427.96951: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.96964: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.97016: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f64630929f0> <<< 32134 1727204427.97101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463093380> <<< 32134 1727204427.97116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32134 1727204427.97168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 32134 1727204427.97204: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630942c0> <<< 32134 1727204427.97234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32134 1727204427.97304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 32134 1727204427.97330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 32134 1727204427.97352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 32134 1727204427.97448: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463096fc0> <<< 32134 1727204427.97943: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630970e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463095280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646309af60> <<< 32134 1727204427.97968: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463099a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463099790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32134 1727204427.98096: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646309be00> <<< 32134 1727204427.98148: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f6463095790> <<< 32134 1727204427.98200: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.98204: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.98222: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630df140> <<< 32134 1727204427.98263: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 32134 1727204427.98284: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630df2c0> <<< 32134 1727204427.98315: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32134 1727204427.98349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 32134 1727204427.98404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 32134 1727204427.98408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 32134 1727204427.98596: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630e4e90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630e4c50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32134 1727204427.98672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 32134 1727204427.98762: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.98787: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630e73e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630e5580> <<< 32134 1727204427.98823: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 32134 1727204427.98908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204427.98944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 32134 1727204427.98980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 32134 1727204427.98996: stdout chunk (state=3): >>>import '_string' # <<< 32134 1727204427.99081: stdout 
chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630eeb40> <<< 32134 1727204427.99356: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630e74d0> <<< 32134 1727204427.99480: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.99506: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630ef920> <<< 32134 1727204427.99697: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630efb90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630efec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630df5c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 32134 1727204427.99748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32134 1727204427.99791: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.99859: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204427.99863: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630f35c0> <<< 32134 1727204428.00221: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.00247: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.00251: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630f4680> <<< 32134 1727204428.00264: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630f1d30> <<< 32134 1727204428.00333: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' 
<<< 32134 1727204428.00336: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630f30b0> <<< 32134 1727204428.00371: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630f1910> <<< 32134 1727204428.00376: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.00425: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.00429: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 32134 1727204428.00692: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.00793: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.00845: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.00849: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 32134 1727204428.00870: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.00908: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.00921: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 32134 1727204428.00952: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.01200: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.01433: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.02655: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.03881: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 32134 1727204428.03928: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 32134 1727204428.03932: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 32134 1727204428.03961: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 32134 1727204428.04016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.04115: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.04119: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.04142: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f7c860> <<< 32134 1727204428.04298: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 32134 1727204428.04323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32134 1727204428.04366: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7d700> <<< 32134 1727204428.04383: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630f0230> <<< 32134 1727204428.04499: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.04522: stdout chunk (state=3): >>>import 
'ansible.module_utils._text' # <<< 32134 1727204428.04552: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.04852: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.05163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 32134 1727204428.05293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7ddc0> # zipimport: zlib available <<< 32134 1727204428.06222: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.07210: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.07358: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.07507: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32134 1727204428.07535: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.07607: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.07674: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32134 1727204428.07707: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.07839: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.08038: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32134 1727204428.08199: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.08242: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32134 1727204428.08272: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.08799: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.09327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32134 1727204428.09448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 32134 1727204428.09485: stdout chunk (state=3): >>>import '_ast' # <<< 32134 1727204428.09655: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7e5d0> <<< 32134 1727204428.09685: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.09820: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.09932: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 32134 1727204428.10021: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 32134 1727204428.10043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 32134 1727204428.10404: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f86060> <<< 
32134 1727204428.10599: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f869f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7f440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 32134 1727204428.10663: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.10729: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.10829: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.10954: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 32134 1727204428.11037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.11192: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.11210: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f858b0> <<< 32134 1727204428.11284: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f86b40> <<< 32134 1727204428.11350: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 32134 1727204428.11368: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 32134 1727204428.11378: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.11482: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.11595: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.11643: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.11730: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 32134 1727204428.11733: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.11798: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 32134 1727204428.11829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 32134 1727204428.11934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 32134 1727204428.12011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 32134 1727204428.12025: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32134 1727204428.12196: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301ad20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f90a70> <<< 32134 1727204428.12350: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f8eba0> <<< 32134 1727204428.12379: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f8e9f0> # destroy ansible.module_utils.distro <<< 32134 1727204428.12405: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 32134 1727204428.12421: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.12470: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.12517: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 32134 1727204428.12599: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 32134 1727204428.12640: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 32134 1727204428.12672: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.12715: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.12735: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 32134 1727204428.12759: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.12877: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13095: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.13135: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13207: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13273: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 32134 1727204428.13373: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13526: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13679: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13718: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.13895: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 32134 1727204428.14127: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.14445: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.14517: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.14604: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.14645: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 32134 1727204428.14685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 32134 1727204428.14722: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 32134 1727204428.14759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 32134 1727204428.14824: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301da90> <<< 32134 1727204428.14844: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 32134 1727204428.15015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 32134 1727204428.15060: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462548350> <<< 32134 1727204428.15073: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.15110: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.15127: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64625486b0> <<< 32134 1727204428.15203: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462ffd3d0> <<< 32134 1727204428.15239: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462ffc620> <<< 32134 1727204428.15288: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301c1a0> <<< 32134 1727204428.15321: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301fc20> <<< 32134 1727204428.15350: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 32134 1727204428.15453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 32134 1727204428.15498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 32134 1727204428.15703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646254b650> import 
'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646254af00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646254b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646254a330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 32134 1727204428.15855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 32134 1727204428.15885: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646254b830> <<< 32134 1727204428.15916: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 32134 1727204428.15972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 32134 1727204428.16021: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.16044: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64625b2330> <<< 32134 1727204428.16081: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625b0350> <<< 32134 1727204428.16145: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301fe00> <<< 32134 1727204428.16150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 32134 1727204428.16201: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 32134 1727204428.16217: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16238: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 32134 1727204428.16296: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16382: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16479: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 32134 1727204428.16527: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16605: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16808: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.16822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 32134 1727204428.16852: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.16929: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17017: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 32134 1727204428.17032: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17104: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17168: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 32134 1727204428.17202: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17314: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17408: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17515: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.17797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 32134 1727204428.18566: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19442: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 32134 1727204428.19446: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19537: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19635: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19673: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19736: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 32134 1727204428.19762: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 32134 1727204428.19784: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 32134 1727204428.19842: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.19929: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 32134 1727204428.20039: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20087: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 32134 1727204428.20137: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20177: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20234: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 32134 1727204428.20246: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20357: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 32134 1727204428.20520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 32134 1727204428.20576: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625b2660> <<< 32134 1727204428.20579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 32134 1727204428.20796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 32134 1727204428.20832: 
stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625b3290> import 'ansible.module_utils.facts.system.local' # <<< 32134 1727204428.20851: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.20955: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21071: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 32134 1727204428.21094: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21231: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 32134 1727204428.21415: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21515: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 32134 1727204428.21670: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21709: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.21796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 32134 1727204428.21860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 32134 1727204428.21977: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.22083: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64625e6720> <<< 32134 1727204428.22429: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625cf170> import 'ansible.module_utils.facts.system.python' # <<< 32134 1727204428.22459: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.22536: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.22627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 32134 1727204428.22810: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.22908: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.23162: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.23372: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 32134 1727204428.23429: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.23508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 32134 1727204428.23585: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.23722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646239e060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646239dd60> import 'ansible.module_utils.facts.system.user' # <<< 32134 1727204428.23726: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.23760: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 32134 1727204428.23823: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.23888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 32134 1727204428.24176: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.24462: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 32134 1727204428.24630: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.24858: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.24922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 32134 1727204428.24943: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.24971: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.25014: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.25252: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.25502: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 32134 1727204428.25593: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.25747: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.25948: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 32134 1727204428.25967: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.26017: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.26069: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.27100: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.28062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 32134 1727204428.28091: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.28272: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.28453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 32134 1727204428.28643: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.28826: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 32134 1727204428.29098: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.29376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 32134 1727204428.29395: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.29422: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 32134 1727204428.29432: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.29499: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.29566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 32134 1727204428.29576: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.29735: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.29905: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.30287: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.30678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 32134 1727204428.30705: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.30742: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.30809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 32134 1727204428.30850: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.30888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 32134 1727204428.31010: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.31029: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.31174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 32134 1727204428.31186: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.31231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 32134 1727204428.31323: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.31421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 32134 1727204428.31606: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 32134 1727204428.31628: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.32105: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.32638: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 32134 1727204428.32707: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.32793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 32134 1727204428.32806: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.32857: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.32911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 32134 1727204428.32925: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.32966: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 32134 1727204428.33072: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.33123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 32134 1727204428.33133: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33274: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 
1727204428.33408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 32134 1727204428.33429: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33449: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 32134 1727204428.33533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.33605: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 32134 1727204428.33613: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33638: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33667: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33742: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33825: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.33943: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.34063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 32134 1727204428.34070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 32134 1727204428.34164: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32134 1727204428.34238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 32134 1727204428.34256: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.34634: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35012: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 32134 1727204428.35016: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35110: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 32134 1727204428.35166: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35246: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 32134 1727204428.35386: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35453: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 32134 1727204428.35617: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35772: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.35930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 32134 1727204428.35936: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 32134 1727204428.36044: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.37438: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 32134 1727204428.37478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches 
/usr/lib64/python3.12/stringprep.py <<< 32134 1727204428.37508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 32134 1727204428.37552: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.37558: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64623c7650> <<< 32134 1727204428.37662: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64623c6330> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64623c5670> <<< 32134 1727204428.38304: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin<<< 32134 1727204428.38322: stdout chunk (state=3): >>>:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "28", "epoch": "1727204428", "epoch_int": "1727204428", "date": "2024-09-24", "time": "15:00:28", "iso8601_micro": "2024-09-24T19:00:28.380062Z", "iso8601": "2024-09-24T19:00:28Z", "iso8601_basic": "20240924T150028380062", "iso8601_basic_short": "20240924T150028", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32134 1727204428.39351: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 32134 1727204428.39371: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 32134 1727204428.39416: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 32134 1727204428.39497: stdout chunk (state=3): >>># cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy <<< 32134 1727204428.39621: stdout chunk (state=3): >>># cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] 
removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 32134 1727204428.39658: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing 
hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.<<< 32134 1727204428.39671: stdout chunk (state=3): >>>loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible<<< 32134 1727204428.39690: stdout chunk (state=3): >>>.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 32134 1727204428.40200: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 32134 1727204428.40247: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 32134 1727204428.40254: stdout chunk (state=3): >>># destroy zipfile._path <<< 32134 1727204428.40276: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 32134 1727204428.40327: stdout chunk (state=3): >>># destroy 
ipaddress <<< 32134 1727204428.40347: stdout chunk (state=3): >>># destroy ntpath <<< 32134 1727204428.40363: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 32134 1727204428.40387: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 32134 1727204428.40396: stdout chunk (state=3): >>># destroy grp <<< 32134 1727204428.40408: stdout chunk (state=3): >>># destroy encodings <<< 32134 1727204428.40436: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select <<< 32134 1727204428.40456: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 32134 1727204428.40532: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 32134 1727204428.40601: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 32134 1727204428.40609: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 32134 1727204428.40646: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 32134 1727204428.40682: stdout chunk (state=3): >>># destroy _heapq # destroy _queue <<< 32134 1727204428.40686: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata <<< 32134 1727204428.40694: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 32134 1727204428.40725: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 32134 1727204428.40728: stdout chunk (state=3): >>># destroy subprocess <<< 32134 1727204428.40742: stdout chunk (state=3): >>># destroy base64 <<< 32134 1727204428.40907: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep <<< 32134 1727204428.40932: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 32134 1727204428.40950: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 32134 1727204428.40965: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 32134 1727204428.40981: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 32134 1727204428.41010: stdout chunk 
(state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 32134 1727204428.41027: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 32134 1727204428.41049: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 32134 1727204428.41059: stdout chunk (state=3): >>># cleanup[3] wiping re._parser <<< 32134 1727204428.41101: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 32134 1727204428.41105: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 32134 1727204428.41108: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 32134 1727204428.41111: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 32134 1727204428.41136: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 32134 1727204428.41155: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 32134 1727204428.41175: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 32134 1727204428.41286: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32134 1727204428.41427: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 32134 1727204428.41461: stdout chunk (state=3): >>># destroy _collections <<< 32134 1727204428.41496: stdout chunk (state=3): >>># destroy platform <<< 32134 1727204428.41499: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 32134 1727204428.41518: stdout chunk (state=3): >>># destroy tokenize <<< 32134 1727204428.41543: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 32134 1727204428.41552: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 32134 1727204428.41590: stdout chunk (state=3): >>># destroy _typing <<< 32134 1727204428.41601: stdout chunk (state=3): >>># destroy _tokenize <<< 32134 1727204428.41619: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 32134 1727204428.41637: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 32134 1727204428.41680: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 32134 1727204428.41803: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna <<< 32134 1727204428.41817: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 32134 1727204428.41858: stdout chunk (state=3): >>># destroy time # destroy _random <<< 32134 1727204428.41865: stdout chunk (state=3): >>># destroy _weakref <<< 32134 1727204428.41907: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 32134 1727204428.41919: stdout chunk (state=3): >>># destroy itertools <<< 32134 1727204428.41938: stdout chunk (state=3): >>># destroy _abc # destroy posix <<< 32134 1727204428.41946: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread <<< 32134 1727204428.41993: stdout chunk (state=3): >>># clear sys.audit hooks <<< 32134 1727204428.42616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204428.42680: stderr chunk (state=3): >>><<< 32134 1727204428.42683: stdout chunk (state=3): >>><<< 32134 1727204428.42798: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64636d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64636a3ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64636d6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64634c50a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64634c5fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463503e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463503f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646353b860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646353bef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646351bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463519280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463501040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646355f740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646355e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646351a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463502f30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463590740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635002c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6463590bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463590aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6463590e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64634fede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463591520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635911f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463592420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches 
/usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635ac650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64635add60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635aec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64635af2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635ae1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64635afd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64635af470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463592480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632a3cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632cc7a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632cc500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632cc7d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64632cc9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632a1e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632ce000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632ccc80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463592b70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632fa3c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463312510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646334b2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463371a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646334b410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64633131a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646318c410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463311550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632cef60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f646318c6e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_261mcnkr/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631fa1e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631d10d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631d0230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631d35f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6463229bb0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463229970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463229280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64632299d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64631fae70> import 'atexit' # # extension module 'grp' 
loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646322a8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646322ab10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646322b020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463090dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630929f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463093380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630942c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463096fc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630970e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463095280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646309af60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463099a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463099790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646309be00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6463095790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630df140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630df2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630e4e90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630e4c50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630e73e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630e5580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630eeb40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630e74d0> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630ef920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630efb90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630efec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630df5c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630f35c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630f4680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630f1d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64630f30b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630f1910> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f7c860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7d700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64630f0230> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7ddc0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7e5d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f86060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f869f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f7f440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: 
zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6462f858b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f86b40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301ad20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f90a70> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f8eba0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462f8e9f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301da90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462548350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64625486b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462ffd3d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6462ffc620> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301c1a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301fc20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646254b650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646254af00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646254b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646254a330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646254b830> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64625b2330> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625b0350> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646301fe00> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625b2660> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625b3290> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64625e6720> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64625cf170> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f646239e060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f646239dd60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f64623c7650> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64623c6330> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64623c5670> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "28", "epoch": "1727204428", "epoch_int": "1727204428", "date": "2024-09-24", "time": "15:00:28", "iso8601_micro": "2024-09-24T19:00:28.380062Z", "iso8601": "2024-09-24T19:00:28Z", "iso8601_basic": "20240924T150028380062", "iso8601_basic_short": "20240924T150028", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # 
cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing 
ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing 
ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy 
ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy 
ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy 
encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # 
cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] 
removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # 
cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools 
# destroy builtins # destroy _thread # clear sys.audit hooks 32134 1727204428.43694: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204428.43698: _low_level_execute_command(): starting 32134 1727204428.43700: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204427.6624367-32267-273482943208094/ > /dev/null 2>&1 && sleep 0' 32134 1727204428.43720: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204428.43724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204428.43757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.43761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204428.43770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.43829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204428.43835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204428.43836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204428.43885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204428.46687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204428.46738: stderr chunk (state=3): >>><<< 32134 1727204428.46742: stdout chunk (state=3): >>><<< 32134 1727204428.46758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204428.46769: handler run complete 32134 1727204428.46810: variable 'ansible_facts' from source: unknown 32134 1727204428.46873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204428.46976: variable 'ansible_facts' from source: unknown 32134 1727204428.47023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204428.47072: attempt loop complete, returning result 32134 1727204428.47076: _execute() done 32134 1727204428.47078: dumping result to json 32134 1727204428.47092: done dumping result, returning 32134 1727204428.47102: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-753f-5162-0000000000b4] 32134 1727204428.47105: sending task result for task 12b410aa-8751-753f-5162-0000000000b4 32134 1727204428.47259: done sending task result for task 12b410aa-8751-753f-5162-0000000000b4 32134 1727204428.47263: WORKER PROCESS EXITING ok: [managed-node2] 32134 1727204428.47431: no more pending results, returning what we have 32134 1727204428.47434: results queue empty 32134 1727204428.47435: checking for any_errors_fatal 32134 1727204428.47437: done checking for any_errors_fatal 32134 1727204428.47437: checking for max_fail_percentage 32134 1727204428.47439: done checking for max_fail_percentage 32134 1727204428.47440: checking to see if all hosts have failed and the running result is not ok 32134 1727204428.47441: done checking to see if all hosts have failed 32134 1727204428.47442: getting the remaining hosts for this loop 32134 1727204428.47443: done getting the remaining hosts for this loop 32134 1727204428.47447: getting the next task for host managed-node2 32134 1727204428.47455: done getting next task for host managed-node2 32134 1727204428.47458: ^ task is: TASK: Check if system is ostree 32134 1727204428.47460: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204428.47463: getting variables 32134 1727204428.47465: in VariableManager get_vars() 32134 1727204428.47503: Calling all_inventory to load vars for managed-node2 32134 1727204428.47506: Calling groups_inventory to load vars for managed-node2 32134 1727204428.47509: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204428.47519: Calling all_plugins_play to load vars for managed-node2 32134 1727204428.47522: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204428.47524: Calling groups_plugins_play to load vars for managed-node2 32134 1727204428.47687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204428.47867: done with get_vars() 32134 1727204428.47875: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:00:28 -0400 (0:00:00.935) 0:00:02.883 ***** 32134 1727204428.47957: entering _queue_task() for managed-node2/stat 32134 1727204428.48169: worker is 1 (out of 1 available) 32134 1727204428.48186: exiting _queue_task() for managed-node2/stat 32134 1727204428.48199: done queuing things up, now waiting for results queue to drain 32134 1727204428.48201: waiting for pending results... 32134 1727204428.48350: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 32134 1727204428.48425: in run() - task 12b410aa-8751-753f-5162-0000000000b6 32134 1727204428.48444: variable 'ansible_search_path' from source: unknown 32134 1727204428.48448: variable 'ansible_search_path' from source: unknown 32134 1727204428.48474: calling self._execute() 32134 1727204428.48539: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204428.48543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204428.48558: variable 'omit' from source: magic vars 32134 1727204428.48943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204428.49160: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204428.49199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204428.49232: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204428.49263: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204428.49339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204428.49361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204428.49383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204428.49407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
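[Annotation] The trace above and below is the "Check if system is ostree" step from el_repo_setup.yml:17: the worker loads the stat action, evaluates the conditional "not __network_is_ostree is defined", and then builds and ships an AnsiballZ_stat.py payload over the existing SSH mux session. A minimal sketch of what such a task typically looks like is given here for orientation only; the path being stat'ed and the register variable name are assumptions, since the module arguments are not echoed in this log:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumed path; a common marker file for ostree-based systems
      register: __ostree_booted_stat    # assumed variable name
      when: not __network_is_ostree is defined   # conditional shown in the trace below

The connection variables set in the trace (ansible_connection=ssh, ansible_pipelining=False, ansible_shell_executable=/bin/sh) explain why the run proceeds via a remote temp directory, an SFTP transfer of AnsiballZ_stat.py, a chmod, and a separate python3.12 invocation rather than piping the module over stdin.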
32134 1727204428.49510: Evaluated conditional (not __network_is_ostree is defined): True 32134 1727204428.49516: variable 'omit' from source: magic vars 32134 1727204428.49552: variable 'omit' from source: magic vars 32134 1727204428.49580: variable 'omit' from source: magic vars 32134 1727204428.49605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204428.49629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204428.49645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204428.49664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204428.49674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204428.49701: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204428.49705: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204428.49709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204428.49793: Set connection var ansible_timeout to 10 32134 1727204428.49806: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204428.49810: Set connection var ansible_connection to ssh 32134 1727204428.49815: Set connection var ansible_shell_type to sh 32134 1727204428.49820: Set connection var ansible_shell_executable to /bin/sh 32134 1727204428.49826: Set connection var ansible_pipelining to False 32134 1727204428.49845: variable 'ansible_shell_executable' from source: unknown 32134 1727204428.49848: variable 'ansible_connection' from source: unknown 32134 1727204428.49851: variable 'ansible_module_compression' from source: unknown 32134 1727204428.49855: variable 'ansible_shell_type' from source: unknown 32134 1727204428.49859: variable 'ansible_shell_executable' from source: unknown 32134 1727204428.49861: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204428.49867: variable 'ansible_pipelining' from source: unknown 32134 1727204428.49871: variable 'ansible_timeout' from source: unknown 32134 1727204428.49881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204428.49998: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204428.50006: variable 'omit' from source: magic vars 32134 1727204428.50014: starting attempt loop 32134 1727204428.50018: running the handler 32134 1727204428.50028: _low_level_execute_command(): starting 32134 1727204428.50036: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204428.50577: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.50581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 
1727204428.50584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.50586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.50643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204428.50647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204428.50700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204428.53300: stdout chunk (state=3): >>>/root <<< 32134 1727204428.53459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204428.53513: stderr chunk (state=3): >>><<< 32134 1727204428.53519: stdout chunk (state=3): >>><<< 32134 1727204428.53542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204428.53556: _low_level_execute_command(): starting 32134 1727204428.53562: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804 `" && echo ansible-tmp-1727204428.5354059-32295-188807417674804="` echo /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804 `" ) && sleep 0' 32134 1727204428.53994: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.54038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204428.54042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.54045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204428.54047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.54092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204428.54096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204428.54144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204428.56986: stdout chunk (state=3): >>>ansible-tmp-1727204428.5354059-32295-188807417674804=/root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804 <<< 32134 1727204428.57183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204428.57234: stderr chunk (state=3): >>><<< 32134 1727204428.57238: stdout chunk (state=3): >>><<< 32134 1727204428.57253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204428.5354059-32295-188807417674804=/root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204428.57306: variable 'ansible_module_compression' from source: unknown 32134 1727204428.57355: ANSIBALLZ: Using lock for stat 32134 1727204428.57358: ANSIBALLZ: Acquiring lock 32134 1727204428.57361: ANSIBALLZ: Lock acquired: 140589353833952 32134 1727204428.57364: ANSIBALLZ: Creating module 32134 1727204428.69286: ANSIBALLZ: Writing module into payload 32134 1727204428.69373: ANSIBALLZ: Writing module 32134 1727204428.69393: ANSIBALLZ: Renaming module 32134 1727204428.69399: ANSIBALLZ: Done creating module 32134 1727204428.69415: variable 'ansible_facts' from source: unknown 32134 1727204428.69468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py 32134 1727204428.69599: Sending initial data 32134 1727204428.69603: Sent initial data (153 bytes) 32134 1727204428.70109: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.70112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.70115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204428.70132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.70184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204428.70188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204428.70195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204428.70244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204428.72702: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32134 1727204428.72706: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204428.72762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204428.72807: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpp3zj2r7b /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py <<< 32134 1727204428.72816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py" <<< 32134 1727204428.72851: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpp3zj2r7b" to remote "/root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py" <<< 32134 1727204428.73669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204428.73737: stderr chunk (state=3): >>><<< 32134 1727204428.73741: stdout chunk (state=3): >>><<< 32134 1727204428.73759: done transferring module to remote 32134 1727204428.73775: _low_level_execute_command(): starting 32134 1727204428.73781: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/ /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py && sleep 0' 32134 1727204428.74270: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.74274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.74276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.74278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.74333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204428.74336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204428.74340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204428.74387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204428.77034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204428.77091: stderr chunk (state=3): >>><<< 32134 1727204428.77095: stdout chunk (state=3): >>><<< 32134 1727204428.77110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204428.77116: _low_level_execute_command(): starting 32134 1727204428.77119: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/AnsiballZ_stat.py && sleep 0' 32134 1727204428.77593: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204428.77598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.77600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204428.77603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204428.77605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204428.77661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204428.77665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204428.77723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204428.81063: stdout chunk (state=3): >>>import _frozen_importlib # frozen<<< 32134 1727204428.81066: stdout chunk (state=3): >>> <<< 32134 1727204428.81142: stdout chunk (state=3): >>>import _imp # builtin<<< 32134 1727204428.81149: stdout chunk (state=3): >>> <<< 32134 1727204428.81200: stdout chunk (state=3): >>>import '_thread' # <<< 32134 1727204428.81206: stdout chunk (state=3): >>> <<< 32134 1727204428.81229: stdout chunk (state=3): >>>import '_warnings' # <<< 32134 1727204428.81234: stdout chunk (state=3): >>> <<< 32134 1727204428.81254: stdout chunk (state=3): >>>import '_weakref' # <<< 32134 1727204428.81381: stdout chunk (state=3): >>> import '_io' # <<< 32134 1727204428.81400: stdout chunk (state=3): >>> <<< 32134 1727204428.81418: stdout chunk (state=3): >>>import 'marshal' # <<< 32134 1727204428.81425: stdout chunk (state=3): >>> <<< 32134 1727204428.81503: stdout chunk (state=3): >>>import 'posix' # 
<<< 32134 1727204428.81515: stdout chunk (state=3): >>> <<< 32134 1727204428.81569: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 32134 1727204428.81575: stdout chunk (state=3): >>> <<< 32134 1727204428.81608: stdout chunk (state=3): >>># installing zipimport hook<<< 32134 1727204428.81610: stdout chunk (state=3): >>> <<< 32134 1727204428.81647: stdout chunk (state=3): >>>import 'time' # <<< 32134 1727204428.81653: stdout chunk (state=3): >>> <<< 32134 1727204428.81674: stdout chunk (state=3): >>>import 'zipimport' # <<< 32134 1727204428.81769: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 32134 1727204428.81774: stdout chunk (state=3): >>> <<< 32134 1727204428.81802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.81851: stdout chunk (state=3): >>>import '_codecs' # <<< 32134 1727204428.81857: stdout chunk (state=3): >>> <<< 32134 1727204428.81909: stdout chunk (state=3): >>>import 'codecs' # <<< 32134 1727204428.81918: stdout chunk (state=3): >>> <<< 32134 1727204428.82012: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 32134 1727204428.82094: stdout chunk (state=3): >>> import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed320c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed31dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed320ea20> <<< 32134 1727204428.82163: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 32134 1727204428.82173: stdout chunk (state=3): >>> <<< 32134 1727204428.82190: stdout chunk (state=3): >>>import 'abc' # <<< 32134 1727204428.82196: stdout chunk (state=3): >>> <<< 32134 1727204428.82270: stdout chunk (state=3): >>>import 'io' # import '_stat' # <<< 32134 1727204428.82273: stdout chunk (state=3): >>> <<< 32134 1727204428.82297: stdout chunk (state=3): >>>import 'stat' # <<< 32134 1727204428.82303: stdout chunk (state=3): >>> <<< 32134 1727204428.82447: stdout chunk (state=3): >>>import '_collections_abc' # <<< 32134 1727204428.82499: stdout chunk (state=3): >>> import 'genericpath' # <<< 32134 1727204428.82516: stdout chunk (state=3): >>> <<< 32134 1727204428.82519: stdout chunk (state=3): >>>import 'posixpath' # <<< 32134 1727204428.82532: stdout chunk (state=3): >>> <<< 32134 1727204428.82586: stdout chunk (state=3): >>>import 'os' # <<< 32134 1727204428.82630: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages<<< 32134 1727204428.82648: stdout chunk (state=3): >>> Processing global site-packages<<< 32134 1727204428.82670: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages'<<< 32134 1727204428.82697: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 32134 1727204428.82717: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 32134 1727204428.82752: stdout 
chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 32134 1727204428.82758: stdout chunk (state=3): >>> <<< 32134 1727204428.82815: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 32134 1727204428.82859: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30210a0><<< 32134 1727204428.82864: stdout chunk (state=3): >>> <<< 32134 1727204428.82968: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 32134 1727204428.82973: stdout chunk (state=3): >>> <<< 32134 1727204428.83013: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3021fd0><<< 32134 1727204428.83060: stdout chunk (state=3): >>> import 'site' # <<< 32134 1727204428.83066: stdout chunk (state=3): >>> <<< 32134 1727204428.83116: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux<<< 32134 1727204428.83133: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information.<<< 32134 1727204428.83144: stdout chunk (state=3): >>> <<< 32134 1727204428.83565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 32134 1727204428.83576: stdout chunk (state=3): >>> <<< 32134 1727204428.83604: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 32134 1727204428.83629: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 32134 1727204428.83665: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 32134 1727204428.83671: stdout chunk (state=3): >>> <<< 32134 1727204428.83735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 32134 1727204428.83769: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 32134 1727204428.83776: stdout chunk (state=3): >>> <<< 32134 1727204428.83818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 32134 1727204428.83850: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305fe90><<< 32134 1727204428.83856: stdout chunk (state=3): >>> <<< 32134 1727204428.83920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 32134 1727204428.83925: stdout chunk (state=3): >>> <<< 32134 1727204428.83960: stdout chunk (state=3): >>>import '_operator' # <<< 32134 1727204428.83985: stdout chunk (state=3): >>> import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efed305ff50><<< 32134 1727204428.84025: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 32134 1727204428.84031: stdout chunk (state=3): >>> <<< 32134 1727204428.84076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 32134 1727204428.84193: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.84237: stdout chunk (state=3): >>>import 'itertools' # <<< 32134 1727204428.84243: stdout chunk (state=3): >>> <<< 32134 1727204428.84275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 32134 1727204428.84300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 32134 1727204428.84314: stdout chunk (state=3): >>> <<< 32134 1727204428.84324: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3097890><<< 32134 1727204428.84357: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 32134 1727204428.84363: stdout chunk (state=3): >>> <<< 32134 1727204428.84382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 32134 1727204428.84405: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3097f20><<< 32134 1727204428.84442: stdout chunk (state=3): >>> import '_collections' # <<< 32134 1727204428.84447: stdout chunk (state=3): >>> <<< 32134 1727204428.84520: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3077b60><<< 32134 1727204428.84550: stdout chunk (state=3): >>> import '_functools' # <<< 32134 1727204428.84608: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3075280><<< 32134 1727204428.84614: stdout chunk (state=3): >>> <<< 32134 1727204428.84781: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305d040><<< 32134 1727204428.84790: stdout chunk (state=3): >>> <<< 32134 1727204428.84825: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 32134 1727204428.84866: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 32134 1727204428.84899: stdout chunk (state=3): >>>import '_sre' # <<< 32134 1727204428.84940: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 32134 1727204428.84948: stdout chunk (state=3): >>> <<< 32134 1727204428.84995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 32134 1727204428.85032: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 32134 1727204428.85037: stdout chunk (state=3): >>> <<< 32134 1727204428.85124: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30bb7d0> <<< 32134 1727204428.85159: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ba3f0><<< 32134 1727204428.85162: stdout chunk (state=3): >>> <<< 32134 1727204428.85207: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 32134 1727204428.85217: stdout chunk (state=3): >>> <<< 32134 1727204428.85229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3076270><<< 32134 1727204428.85316: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30b8c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 32134 1727204428.85343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 32134 1727204428.85370: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ec770> <<< 32134 1727204428.85381: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305c2c0><<< 32134 1727204428.85390: stdout chunk (state=3): >>> <<< 32134 1727204428.85418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 32134 1727204428.85425: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 32134 1727204428.85470: stdout chunk (state=3): >>> # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.85475: stdout chunk (state=3): >>> <<< 32134 1727204428.85503: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.85516: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed30ecc20><<< 32134 1727204428.85584: stdout chunk (state=3): >>> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ecad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.85592: stdout chunk (state=3): >>> <<< 32134 1727204428.85616: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.85619: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed30ecec0><<< 32134 1727204428.85644: stdout chunk (state=3): >>> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305ade0><<< 32134 1727204428.85686: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 32134 1727204428.85693: stdout chunk (state=3): >>> <<< 32134 1727204428.85893: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ed5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ed280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ee4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 32134 1727204428.85946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 32134 1727204428.85985: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 32134 1727204428.85997: stdout chunk (state=3): >>> <<< 32134 1727204428.86011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 32134 1727204428.86018: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed31086e0><<< 32134 1727204428.86048: stdout chunk (state=3): >>> import 'errno' # <<< 32134 1727204428.86053: stdout chunk (state=3): >>> <<< 32134 1727204428.86109: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.86118: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed3109e20><<< 32134 1727204428.86151: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 32134 1727204428.86159: stdout chunk (state=3): >>> <<< 32134 1727204428.86186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 32134 1727204428.86221: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 32134 1727204428.86227: stdout chunk (state=3): >>> <<< 32134 1727204428.86267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310acf0><<< 32134 1727204428.86270: stdout chunk (state=3): >>> <<< 32134 1727204428.86328: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.86357: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed310b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310a270><<< 32134 1727204428.86363: stdout chunk (state=3): >>> <<< 32134 1727204428.86417: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 32134 1727204428.86422: stdout chunk (state=3): >>> <<< 32134 1727204428.86478: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.86515: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed310bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310b500><<< 32134 1727204428.86518: stdout chunk (state=3): >>> <<< 32134 1727204428.86617: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ee510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 32134 1727204428.86669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 32134 1727204428.86708: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 32134 1727204428.86714: stdout chunk (state=3): >>> <<< 32134 1727204428.86751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 32134 1727204428.86802: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.86824: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.86868: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2eefce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 32134 1727204428.86877: stdout chunk (state=3): >>> <<< 32134 1727204428.86892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 32134 1727204428.86936: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.86944: stdout chunk (state=3): >>> <<< 32134 1727204428.86960: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.86969: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2f187d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f18530><<< 32134 1727204428.87010: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.87034: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.87041: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2f18800><<< 32134 1727204428.87082: stdout chunk (state=3): >>> # extension module '_sha2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.87096: stdout chunk (state=3): >>> <<< 32134 1727204428.87108: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.87147: stdout chunk (state=3): >>>import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2f189e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2eede80><<< 32134 1727204428.87153: stdout chunk (state=3): >>> <<< 32134 1727204428.87196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32134 1727204428.87507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f1a030><<< 32134 1727204428.87545: stdout chunk (state=3): >>> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f18cb0> <<< 32134 1727204428.87593: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30eec00><<< 32134 1727204428.87597: stdout chunk (state=3): >>> <<< 32134 1727204428.87646: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 32134 1727204428.87723: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.87762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 32134 1727204428.87848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 32134 1727204428.87904: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f463c0> <<< 32134 1727204428.87982: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 32134 1727204428.88011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 32134 1727204428.88046: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 32134 1727204428.88101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 32134 1727204428.88183: stdout chunk (state=3): >>> import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f5e540> <<< 32134 1727204428.88247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 32134 1727204428.88285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 32134 1727204428.88292: stdout chunk (state=3): >>> <<< 32134 1727204428.88402: stdout chunk (state=3): >>>import 'ntpath' # <<< 32134 1727204428.88406: stdout chunk (state=3): >>> <<< 32134 1727204428.88452: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 32134 1727204428.88457: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 32134 1727204428.88509: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f972c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32134 1727204428.88577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 32134 1727204428.88624: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 32134 1727204428.88630: stdout chunk (state=3): >>> <<< 32134 1727204428.88702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 32134 1727204428.88708: stdout chunk (state=3): >>> <<< 32134 1727204428.88984: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2fbda60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f973e0><<< 32134 1727204428.88988: stdout chunk (state=3): >>> <<< 32134 1727204428.89063: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f5f1d0><<< 32134 1727204428.89070: stdout chunk (state=3): >>> <<< 32134 1727204428.89118: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py<<< 32134 1727204428.89127: stdout chunk (state=3): >>> <<< 32134 1727204428.89156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 32134 1727204428.89158: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d98440> <<< 32134 1727204428.89204: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f5d580><<< 32134 1727204428.89217: stdout chunk (state=3): >>> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f1af60><<< 32134 1727204428.89224: stdout chunk (state=3): >>> <<< 32134 1727204428.89381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 32134 1727204428.89387: stdout chunk (state=3): >>> <<< 32134 1727204428.89488: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7efed2d986e0> <<< 32134 1727204428.89571: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_vfzbmu8r/ansible_stat_payload.zip' <<< 32134 1727204428.89604: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204428.89609: stdout chunk (state=3): >>> <<< 32134 1727204428.89901: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204428.89906: stdout chunk (state=3): >>> <<< 32134 1727204428.89978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 32134 1727204428.89985: stdout chunk (state=3): >>> <<< 32134 1727204428.90221: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 32134 1727204428.90325: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 32134 1727204428.90329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2df21e0> <<< 32134 1727204428.90359: stdout chunk (state=3): >>>import '_typing' # <<< 32134 1727204428.90748: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2dc90d0> <<< 32134 1727204428.90834: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2dc8230> <<< 32134 1727204428.90841: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204428.90844: stdout chunk (state=3): >>> import 'ansible' # <<< 32134 1727204428.90867: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.90915: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 32134 1727204428.90945: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 32134 1727204428.90961: stdout chunk (state=3): >>> <<< 32134 1727204428.91197: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204428.93593: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204428.93607: stdout chunk (state=3): >>> <<< 32134 1727204428.95870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 32134 1727204428.95884: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 32134 1727204428.95915: stdout chunk (state=3): >>> <<< 32134 1727204428.95937: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2dcb1d0> <<< 32134 1727204428.95988: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py<<< 32134 1727204428.95996: stdout chunk (state=3): >>> <<< 32134 1727204428.96029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.96071: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 32134 1727204428.96074: stdout chunk (state=3): >>> <<< 32134 1727204428.96109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 32134 1727204428.96112: stdout chunk (state=3): >>> <<< 32134 1727204428.96153: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 32134 1727204428.96167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 32134 1727204428.96182: stdout chunk (state=3): >>> <<< 32134 1727204428.96222: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.96249: stdout chunk (state=3): >>> # 
extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.96266: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2e1db50><<< 32134 1727204428.96324: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1d8e0><<< 32134 1727204428.96328: stdout chunk (state=3): >>> <<< 32134 1727204428.96386: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1d1f0><<< 32134 1727204428.96391: stdout chunk (state=3): >>> <<< 32134 1727204428.96419: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 32134 1727204428.96448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 32134 1727204428.96513: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1d940><<< 32134 1727204428.96518: stdout chunk (state=3): >>> <<< 32134 1727204428.96541: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310b230> <<< 32134 1727204428.96555: stdout chunk (state=3): >>>import 'atexit' # <<< 32134 1727204428.96584: stdout chunk (state=3): >>> <<< 32134 1727204428.96614: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.96618: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.96671: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2e1e840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.96677: stdout chunk (state=3): >>> <<< 32134 1727204428.96679: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.96701: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2e1ea80><<< 32134 1727204428.96716: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py<<< 32134 1727204428.96739: stdout chunk (state=3): >>> <<< 32134 1727204428.96823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 32134 1727204428.96862: stdout chunk (state=3): >>> import '_locale' # <<< 32134 1727204428.96865: stdout chunk (state=3): >>> <<< 32134 1727204428.96931: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1efc0><<< 32134 1727204428.96963: stdout chunk (state=3): >>> import 'pwd' # <<< 32134 1727204428.96967: stdout chunk (state=3): >>> <<< 32134 1727204428.97048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 32134 1727204428.97051: stdout chunk (state=3): >>> <<< 32134 1727204428.97124: stdout chunk (state=3): >>>import 'platform' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efed2c80d40><<< 32134 1727204428.97128: stdout chunk (state=3): >>> <<< 32134 1727204428.97169: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.97216: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2c82960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 32134 1727204428.97227: stdout chunk (state=3): >>> <<< 32134 1727204428.97261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 32134 1727204428.97324: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c83320><<< 32134 1727204428.97327: stdout chunk (state=3): >>> <<< 32134 1727204428.97374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 32134 1727204428.97378: stdout chunk (state=3): >>> <<< 32134 1727204428.97430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 32134 1727204428.97433: stdout chunk (state=3): >>> <<< 32134 1727204428.97462: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c84500><<< 32134 1727204428.97508: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32134 1727204428.97607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 32134 1727204428.97630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 32134 1727204428.97732: stdout chunk (state=3): >>> import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c86fc0><<< 32134 1727204428.97745: stdout chunk (state=3): >>> <<< 32134 1727204428.97808: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.97842: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.97853: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2c870b0> <<< 32134 1727204428.97888: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c85280> <<< 32134 1727204428.97931: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 32134 1727204428.98017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 32134 1727204428.98031: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 
32134 1727204428.98068: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32134 1727204428.98129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 32134 1727204428.98192: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 32134 1727204428.98195: stdout chunk (state=3): >>> <<< 32134 1727204428.98224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c8af00><<< 32134 1727204428.98260: stdout chunk (state=3): >>> import '_tokenize' # <<< 32134 1727204428.98276: stdout chunk (state=3): >>> <<< 32134 1727204428.98387: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c899d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c89730><<< 32134 1727204428.98432: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 32134 1727204428.98585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c8bd10> <<< 32134 1727204428.98648: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c85790> <<< 32134 1727204428.98707: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.98736: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.98756: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cd3020> <<< 32134 1727204428.98804: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 32134 1727204428.98822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd3260> <<< 32134 1727204428.98864: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32134 1727204428.98898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 32134 1727204428.98949: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 32134 1727204428.98952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 32134 1727204428.99043: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204428.99056: stdout chunk (state=3): 
>>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cd4d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd4b00><<< 32134 1727204428.99100: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32134 1727204428.99317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 32134 1727204428.99399: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.99439: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204428.99464: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cd72c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd5430> <<< 32134 1727204428.99591: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204428.99802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cdeae0> <<< 32134 1727204429.00010: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd7470> <<< 32134 1727204429.00144: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.00157: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.00237: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cdf8c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.00263: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cdfa70> <<< 32134 1727204429.00332: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204429.00380: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cdfbf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd3410><<< 32134 1727204429.00384: stdout chunk (state=3): >>> <<< 32134 1727204429.00416: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 32134 1727204429.00470: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 32134 1727204429.00482: stdout chunk (state=3): >>> <<< 32134 1727204429.00523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 32134 1727204429.00587: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.00646: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.00662: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2ce34d0><<< 32134 1727204429.00804: stdout chunk (state=3): >>> <<< 32134 1727204429.01018: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.01066: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.01078: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2ce4920> <<< 32134 1727204429.01103: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2ce1c40> <<< 32134 1727204429.01169: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.01193: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2ce2ff0> <<< 32134 1727204429.01235: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2ce1850> # zipimport: zlib available<<< 32134 1727204429.01259: stdout chunk (state=3): >>> <<< 32134 1727204429.01291: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 32134 1727204429.01310: stdout chunk (state=3): >>> <<< 32134 1727204429.01327: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.01492: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32134 1727204429.01504: stdout chunk (state=3): >>> <<< 32134 1727204429.01671: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.01720: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.01747: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 32134 1727204429.01816: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.01824: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32134 1727204429.01868: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 32134 1727204429.01875: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.02120: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.02498: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 32134 1727204429.03570: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.03595: stdout chunk (state=3): >>> <<< 32134 1727204429.04787: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 32134 1727204429.04823: stdout chunk (state=3): >>> import 'ansible.module_utils.six.moves.collections_abc' # <<< 32134 1727204429.04878: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 32134 1727204429.04901: stdout chunk (state=3): >>> <<< 32134 1727204429.04943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204429.05027: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204429.05207: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2d6cb60> <<< 32134 1727204429.05247: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32134 1727204429.05307: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6d8e0><<< 32134 1727204429.05332: stdout chunk (state=3): >>> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1da00><<< 32134 1727204429.05343: stdout chunk (state=3): >>> <<< 32134 1727204429.05415: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 32134 1727204429.05446: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.05505: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.05560: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 32134 1727204429.05564: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.05868: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.06193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 32134 1727204429.06215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 32134 1727204429.06257: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6d6a0> # zipimport: zlib available <<< 32134 1727204429.07503: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.08315: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.08340: stdout chunk (state=3): >>> <<< 32134 1727204429.08473: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.08632: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32134 1727204429.08670: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.08741: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.08815: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32134 1727204429.08846: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.09011: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.09220: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32134 1727204429.09259: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.09316: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.09323: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 32134 1727204429.09347: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.09425: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.09435: stdout chunk (state=3): >>> <<< 32134 1727204429.09491: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32134 1727204429.09526: stdout chunk (state=3): >>> # zipimport: zlib available <<< 32134 1727204429.10014: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.10626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'<<< 32134 1727204429.10629: stdout chunk (state=3): >>> <<< 32134 1727204429.10662: stdout chunk (state=3): >>>import '_ast' # <<< 32134 1727204429.10821: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6ea50><<< 32134 1727204429.10825: stdout chunk (state=3): >>> <<< 32134 1727204429.10859: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.10988: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.11004: stdout chunk (state=3): >>> <<< 32134 1727204429.11130: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 32134 1727204429.11133: stdout chunk (state=3): >>> <<< 32134 1727204429.11159: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 32134 1727204429.11211: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 32134 1727204429.11266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 32134 1727204429.11403: stdout chunk (state=3): >>> # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32134 1727204429.11698: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2b7a330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204429.11754: stdout chunk (state=3): >>> # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204429.11767: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2b7ac60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6fa70> <<< 32134 1727204429.11951: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.11956: stdout chunk (state=3): >>># zipimport: zlib available 
import 'ansible.module_utils.common.locale' # <<< 32134 1727204429.11970: stdout chunk (state=3): >>># zipimport: zlib available <<< 32134 1727204429.12052: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.12069: stdout chunk (state=3): >>> <<< 32134 1727204429.12229: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 32134 1727204429.12233: stdout chunk (state=3): >>> <<< 32134 1727204429.12365: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 32134 1727204429.12368: stdout chunk (state=3): >>> <<< 32134 1727204429.12459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 32134 1727204429.12462: stdout chunk (state=3): >>> <<< 32134 1727204429.12624: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 32134 1727204429.12628: stdout chunk (state=3): >>> <<< 32134 1727204429.12650: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2b79970><<< 32134 1727204429.12737: stdout chunk (state=3): >>> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b7ae40><<< 32134 1727204429.12740: stdout chunk (state=3): >>> <<< 32134 1727204429.12795: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 32134 1727204429.12798: stdout chunk (state=3): >>> <<< 32134 1727204429.12818: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 32134 1727204429.12838: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.12864: stdout chunk (state=3): >>> <<< 32134 1727204429.12963: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.13078: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32134 1727204429.13098: stdout chunk (state=3): >>> <<< 32134 1727204429.13144: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.13148: stdout chunk (state=3): >>> <<< 32134 1727204429.13228: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 32134 1727204429.13232: stdout chunk (state=3): >>> <<< 32134 1727204429.13245: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 32134 1727204429.13285: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 32134 1727204429.13298: stdout chunk (state=3): >>> <<< 32134 1727204429.13338: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 32134 1727204429.13355: stdout chunk (state=3): >>> <<< 32134 1727204429.13501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 32134 1727204429.13504: stdout chunk (state=3): >>> <<< 
32134 1727204429.13546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 32134 1727204429.13550: stdout chunk (state=3): >>> <<< 32134 1727204429.13579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32134 1727204429.13698: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c0aed0><<< 32134 1727204429.13708: stdout chunk (state=3): >>> <<< 32134 1727204429.13802: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b87d40><<< 32134 1727204429.13805: stdout chunk (state=3): >>> <<< 32134 1727204429.13954: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b82e40> <<< 32134 1727204429.13985: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b82c90> # destroy ansible.module_utils.distro<<< 32134 1727204429.14023: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 32134 1727204429.14027: stdout chunk (state=3): >>> <<< 32134 1727204429.14127: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 32134 1727204429.14130: stdout chunk (state=3): >>> <<< 32134 1727204429.14141: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 32134 1727204429.14396: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 32134 1727204429.14584: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.14611: stdout chunk (state=3): >>> <<< 32134 1727204429.14995: stdout chunk (state=3): >>># zipimport: zlib available<<< 32134 1727204429.14998: stdout chunk (state=3): >>> <<< 32134 1727204429.15244: stdout chunk (state=3): >>> <<< 32134 1727204429.15247: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 32134 1727204429.15288: stdout chunk (state=3): >>># destroy __main__ <<< 32134 1727204429.15935: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 32134 1727204429.15940: stdout chunk (state=3): >>># clear sys.path_hooks <<< 32134 1727204429.16038: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 32134 1727204429.16041: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc <<< 32134 1727204429.16046: stdout chunk (state=3): >>># clear sys.last_type <<< 32134 1727204429.16140: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path<<< 32134 1727204429.16144: stdout chunk (state=3): >>> # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys<<< 32134 1727204429.16175: stdout chunk (state=3): >>> # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp<<< 32134 1727204429.16227: stdout chunk (state=3): >>> # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath<<< 32134 1727204429.16271: stdout chunk (state=3): >>> # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools<<< 32134 1727204429.16275: stdout chunk (state=3): >>> # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct<<< 32134 1727204429.16331: stdout chunk (state=3): >>> # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc <<< 32134 1727204429.16335: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect<<< 32134 1727204429.16376: stdout chunk (state=3): >>> # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib<<< 32134 1727204429.16429: stdout chunk (state=3): >>> # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing<<< 32134 1727204429.16432: stdout chunk (state=3): >>> # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # 
cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit<<< 32134 1727204429.16468: stdout chunk (state=3): >>> # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string<<< 32134 1727204429.16528: stdout chunk (state=3): >>> # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array<<< 32134 1727204429.16532: stdout chunk (state=3): >>> # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes<<< 32134 1727204429.16571: stdout chunk (state=3): >>> # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 32134 1727204429.16637: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] 
removing ansible.module_utils.common.arg_spec <<< 32134 1727204429.16646: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse<<< 32134 1727204429.16670: stdout chunk (state=3): >>> # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules<<< 32134 1727204429.16799: stdout chunk (state=3): >>> <<< 32134 1727204429.17154: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 32134 1727204429.17170: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc <<< 32134 1727204429.17237: stdout chunk (state=3): >>># destroy importlib.util <<< 32134 1727204429.17287: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 32134 1727204429.17301: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma<<< 32134 1727204429.17344: stdout chunk (state=3): >>> # destroy zipfile._path # destroy zipfile<<< 32134 1727204429.17376: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress<<< 32134 1727204429.17447: stdout chunk (state=3): >>> # destroy ntpath<<< 32134 1727204429.17496: stdout chunk (state=3): >>> # destroy importlib <<< 32134 1727204429.17500: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon<<< 32134 1727204429.17537: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale<<< 32134 1727204429.17578: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess<<< 32134 1727204429.17620: stdout chunk (state=3): >>> # destroy syslog<<< 32134 1727204429.17641: stdout chunk (state=3): >>> # destroy uuid # destroy selectors <<< 32134 1727204429.17666: stdout chunk (state=3): >>># destroy errno # destroy array <<< 32134 1727204429.17946: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib <<< 32134 1727204429.17949: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon<<< 32134 1727204429.17974: stdout chunk (state=3): >>> # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 32134 1727204429.18016: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 32134 1727204429.18051: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit <<< 32134 1727204429.18082: stdout chunk (state=3): >>># cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random<<< 32134 1727204429.18133: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 32134 1727204429.18152: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants<<< 32134 1727204429.18176: stdout chunk (state=3): >>> # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections <<< 32134 1727204429.18216: stdout chunk (state=3): >>># destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 32134 1727204429.18278: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat<<< 32134 1727204429.18285: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external <<< 32134 1727204429.18337: stdout chunk (state=3): >>># cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 32134 1727204429.18365: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 32134 1727204429.18369: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 32134 1727204429.18399: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32134 1727204429.18601: stdout chunk (state=3): >>># destroy sys.monitoring<<< 32134 1727204429.18623: stdout chunk (state=3): >>> # destroy _socket<<< 32134 1727204429.18655: stdout chunk (state=3): >>> # destroy _collections <<< 32134 1727204429.18680: stdout chunk (state=3): >>># destroy platform<<< 32134 1727204429.18709: stdout chunk (state=3): >>> # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser<<< 32134 
1727204429.18742: stdout chunk (state=3): >>> # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 32134 1727204429.18807: stdout chunk (state=3): >>># destroy contextlib # destroy _typing<<< 32134 1727204429.18836: stdout chunk (state=3): >>> # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 32134 1727204429.18876: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 32134 1727204429.18899: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal<<< 32134 1727204429.18939: stdout chunk (state=3): >>> # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 32134 1727204429.18991: stdout chunk (state=3): >>> <<< 32134 1727204429.19092: stdout chunk (state=3): >>># destroy codecs <<< 32134 1727204429.19109: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 <<< 32134 1727204429.19136: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 32134 1727204429.19177: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 32134 1727204429.19229: stdout chunk (state=3): >>> # destroy _random <<< 32134 1727204429.19241: stdout chunk (state=3): >>># destroy _weakref <<< 32134 1727204429.19281: stdout chunk (state=3): >>># destroy _operator # destroy _sha2<<< 32134 1727204429.19321: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools <<< 32134 1727204429.19346: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools<<< 32134 1727204429.19384: stdout chunk (state=3): >>> # destroy builtins # destroy _thread # clear sys.audit hooks <<< 32134 1727204429.19968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204429.20194: stderr chunk (state=3): >>><<< 32134 1727204429.20198: stdout chunk (state=3): >>><<< 32134 1727204429.20215: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed320c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed31dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed320ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3021fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3097890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3097f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3077b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3075280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30bb7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ba3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed3076270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30b8c20> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ec770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305c2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed30ecc20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ecad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed30ecec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed305ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ed5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ed280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ee4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed31086e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed3109e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7efed310acf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed310b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed310bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310b500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30ee510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2eefce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2f187d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f18530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2f18800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2f189e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2eede80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f1a030> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f18cb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed30eec00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f463c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f5e540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f972c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2fbda60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f973e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f5f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d98440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f5d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2f1af60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7efed2d986e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_vfzbmu8r/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2df21e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2dc90d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2dc8230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2dcb1d0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2e1db50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1d8e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1d1f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1d940> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed310b230> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2e1e840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2e1ea80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1efc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c80d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2c82960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c83320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c84500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c86fc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2c870b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c85280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c8af00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c899d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c89730> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c8bd10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c85790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cd3020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd3260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cd4d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd4b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cd72c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd5430> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cdeae0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd7470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cdf8c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cdfa70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2cdfbf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2cd3410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2ce34d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2ce4920> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2ce1c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2ce2ff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2ce1850> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2d6cb60> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6d8e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2e1da00> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6d6a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6ea50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2b7a330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2b7ac60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2d6fa70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efed2b79970> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b7ae40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2c0aed0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b87d40> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b82e40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efed2b82c90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
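The stdout captured above ends with the module's JSON result ({"changed": false, "stat": {"exists": false}, ...}) followed by the interpreter's import/cleanup trace, which is exactly the condition the WARNING below reports as "junk after the JSON data". As an illustration only (a minimal sketch, not Ansible's own parsing code; the helper name parse_module_stdout and the first-"{" heuristic are assumptions for this example), the leading JSON object can be separated from such trailing text with the standard-library json.JSONDecoder.raw_decode:

import json

def parse_module_stdout(stdout: str):
    # Illustrative sketch, not Ansible's implementation: extract the leading
    # JSON object from module stdout that may carry extra text after it.
    decoder = json.JSONDecoder()
    start = stdout.index("{")              # simplification: assume the first "{" starts the JSON document
    result, end = decoder.raw_decode(stdout, start)
    junk = stdout[end:].strip()            # whatever was printed after the JSON (here: the cleanup trace)
    return result, junk

# Example shaped like the log above (values shortened):
out = '{"changed": false, "stat": {"exists": false}} # destroy __main__ # clear sys.path_importer_cache'
result, junk = parse_module_stdout(out)
print(result["stat"]["exists"])   # -> False
print(bool(junk))                 # -> True; trailing text like this is what the warning flags

In this run the leading JSON is still parsed (the task result above reflects it), and the warning only notes that additional interpreter output followed it.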
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 32134 1727204429.21217: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204429.21226: _low_level_execute_command(): starting 32134 1727204429.21229: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204428.5354059-32295-188807417674804/ > /dev/null 2>&1 && sleep 0' 32134 1727204429.21231: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204429.21233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204429.21236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.21238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204429.21241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204429.21243: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204429.21245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.21247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204429.21249: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204429.21251: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32134 1727204429.21253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204429.21255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.21257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204429.21259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204429.21261: stderr chunk (state=3): >>>debug2: match found <<< 32134 1727204429.21263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.21265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204429.21271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204429.21274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204429.21276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204429.24198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204429.24276: stderr chunk (state=3): >>><<< 32134 1727204429.24296: stdout chunk (state=3): >>><<< 32134 1727204429.24397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204429.24401: handler run complete 32134 1727204429.24404: attempt loop complete, returning result 32134 1727204429.24406: _execute() done 32134 1727204429.24409: dumping result to json 32134 1727204429.24410: done dumping result, returning 32134 1727204429.24412: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [12b410aa-8751-753f-5162-0000000000b6] 32134 1727204429.24414: sending task result for task 12b410aa-8751-753f-5162-0000000000b6 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 32134 1727204429.24672: no more pending results, returning what we have 32134 1727204429.24676: results queue empty 32134 1727204429.24677: checking for any_errors_fatal 32134 1727204429.24685: done checking for any_errors_fatal 32134 1727204429.24686: checking for max_fail_percentage 32134 1727204429.24688: done checking for max_fail_percentage 32134 1727204429.24810: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.24812: done checking to see if all hosts have failed 32134 1727204429.24813: getting the remaining hosts for this loop 32134 1727204429.24814: done getting the remaining hosts for this loop 32134 1727204429.24820: getting the next task for host managed-node2 32134 1727204429.24827: done getting next task for host managed-node2 32134 1727204429.24830: ^ task is: TASK: Set flag to indicate system is ostree 32134 1727204429.24833: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.24837: getting variables 32134 1727204429.24839: in VariableManager get_vars() 32134 1727204429.24872: Calling all_inventory to load vars for managed-node2 32134 1727204429.24876: Calling groups_inventory to load vars for managed-node2 32134 1727204429.24881: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.25001: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.25006: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.25012: done sending task result for task 12b410aa-8751-753f-5162-0000000000b6 32134 1727204429.25015: WORKER PROCESS EXITING 32134 1727204429.25029: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.25365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.25602: done with get_vars() 32134 1727204429.25610: done getting variables 32134 1727204429.25684: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.777) 0:00:03.661 ***** 32134 1727204429.25716: entering _queue_task() for managed-node2/set_fact 32134 1727204429.25718: Creating lock for set_fact 32134 1727204429.25933: worker is 1 (out of 1 available) 32134 1727204429.25948: exiting _queue_task() for managed-node2/set_fact 32134 1727204429.25960: done queuing things up, now waiting for results queue to drain 32134 1727204429.25962: waiting for pending results... 
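For reference, the two el_repo_setup.yml tasks traced around this point likely look roughly like the sketch below; the stat path (/run/ostree-booted), the registered result (__ostree_booted_stat), the evaluated conditional (not __network_is_ostree is defined) and the resulting fact are all visible in the log, while the exact YAML wording is an inference rather than the verbatim file.

  # Sketch inferred from the log output, not copied from the actual task file
  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat

  - name: Set flag to indicate system is ostree
    set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined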
32134 1727204429.26122: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 32134 1727204429.26199: in run() - task 12b410aa-8751-753f-5162-0000000000b7 32134 1727204429.26210: variable 'ansible_search_path' from source: unknown 32134 1727204429.26214: variable 'ansible_search_path' from source: unknown 32134 1727204429.26247: calling self._execute() 32134 1727204429.26310: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.26319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.26329: variable 'omit' from source: magic vars 32134 1727204429.26729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204429.26934: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204429.26977: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204429.27007: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204429.27039: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204429.27132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204429.27172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204429.27200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204429.27231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204429.27381: Evaluated conditional (not __network_is_ostree is defined): True 32134 1727204429.27399: variable 'omit' from source: magic vars 32134 1727204429.27444: variable 'omit' from source: magic vars 32134 1727204429.27577: variable '__ostree_booted_stat' from source: set_fact 32134 1727204429.27794: variable 'omit' from source: magic vars 32134 1727204429.27798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204429.27801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204429.27803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204429.27806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204429.27808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204429.27811: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204429.27813: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.27815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.27913: Set connection var ansible_timeout to 10 32134 1727204429.27939: 
Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204429.27943: Set connection var ansible_connection to ssh 32134 1727204429.27946: Set connection var ansible_shell_type to sh 32134 1727204429.27949: Set connection var ansible_shell_executable to /bin/sh 32134 1727204429.27956: Set connection var ansible_pipelining to False 32134 1727204429.27981: variable 'ansible_shell_executable' from source: unknown 32134 1727204429.27985: variable 'ansible_connection' from source: unknown 32134 1727204429.27988: variable 'ansible_module_compression' from source: unknown 32134 1727204429.27993: variable 'ansible_shell_type' from source: unknown 32134 1727204429.27996: variable 'ansible_shell_executable' from source: unknown 32134 1727204429.28002: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.28007: variable 'ansible_pipelining' from source: unknown 32134 1727204429.28010: variable 'ansible_timeout' from source: unknown 32134 1727204429.28019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.28158: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204429.28163: variable 'omit' from source: magic vars 32134 1727204429.28165: starting attempt loop 32134 1727204429.28168: running the handler 32134 1727204429.28170: handler run complete 32134 1727204429.28264: attempt loop complete, returning result 32134 1727204429.28267: _execute() done 32134 1727204429.28271: dumping result to json 32134 1727204429.28273: done dumping result, returning 32134 1727204429.28275: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [12b410aa-8751-753f-5162-0000000000b7] 32134 1727204429.28277: sending task result for task 12b410aa-8751-753f-5162-0000000000b7 32134 1727204429.28339: done sending task result for task 12b410aa-8751-753f-5162-0000000000b7 32134 1727204429.28342: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 32134 1727204429.28420: no more pending results, returning what we have 32134 1727204429.28423: results queue empty 32134 1727204429.28424: checking for any_errors_fatal 32134 1727204429.28430: done checking for any_errors_fatal 32134 1727204429.28431: checking for max_fail_percentage 32134 1727204429.28432: done checking for max_fail_percentage 32134 1727204429.28433: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.28434: done checking to see if all hosts have failed 32134 1727204429.28435: getting the remaining hosts for this loop 32134 1727204429.28436: done getting the remaining hosts for this loop 32134 1727204429.28440: getting the next task for host managed-node2 32134 1727204429.28448: done getting next task for host managed-node2 32134 1727204429.28450: ^ task is: TASK: Fix CentOS6 Base repo 32134 1727204429.28495: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.28500: getting variables 32134 1727204429.28502: in VariableManager get_vars() 32134 1727204429.28533: Calling all_inventory to load vars for managed-node2 32134 1727204429.28536: Calling groups_inventory to load vars for managed-node2 32134 1727204429.28540: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.28550: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.28553: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.28562: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.28792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.28975: done with get_vars() 32134 1727204429.28983: done getting variables 32134 1727204429.29082: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.033) 0:00:03.695 ***** 32134 1727204429.29106: entering _queue_task() for managed-node2/copy 32134 1727204429.29307: worker is 1 (out of 1 available) 32134 1727204429.29321: exiting _queue_task() for managed-node2/copy 32134 1727204429.29334: done queuing things up, now waiting for results queue to drain 32134 1727204429.29336: waiting for pending results... 
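The "Fix CentOS6 Base repo" task queued here is a copy action gated on ansible_distribution == 'CentOS' (see the false_condition in the skip result that follows); a rough sketch is below, where the destination path and repo content are illustrative assumptions only.

  # Sketch; only the module (copy) and the when condition come from the log
  - name: Fix CentOS6 Base repo
    copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo    # assumed destination
      content: |
        # repo definitions pointing at the CentOS 6 vault (illustrative)
    when: ansible_distribution == 'CentOS'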
32134 1727204429.29491: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 32134 1727204429.29570: in run() - task 12b410aa-8751-753f-5162-0000000000b9 32134 1727204429.29585: variable 'ansible_search_path' from source: unknown 32134 1727204429.29588: variable 'ansible_search_path' from source: unknown 32134 1727204429.29624: calling self._execute() 32134 1727204429.29683: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.29695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.29702: variable 'omit' from source: magic vars 32134 1727204429.30148: variable 'ansible_distribution' from source: facts 32134 1727204429.30166: Evaluated conditional (ansible_distribution == 'CentOS'): False 32134 1727204429.30169: when evaluation is False, skipping this task 32134 1727204429.30173: _execute() done 32134 1727204429.30175: dumping result to json 32134 1727204429.30181: done dumping result, returning 32134 1727204429.30187: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [12b410aa-8751-753f-5162-0000000000b9] 32134 1727204429.30194: sending task result for task 12b410aa-8751-753f-5162-0000000000b9 32134 1727204429.30300: done sending task result for task 12b410aa-8751-753f-5162-0000000000b9 32134 1727204429.30303: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 32134 1727204429.30382: no more pending results, returning what we have 32134 1727204429.30386: results queue empty 32134 1727204429.30386: checking for any_errors_fatal 32134 1727204429.30392: done checking for any_errors_fatal 32134 1727204429.30393: checking for max_fail_percentage 32134 1727204429.30395: done checking for max_fail_percentage 32134 1727204429.30396: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.30397: done checking to see if all hosts have failed 32134 1727204429.30398: getting the remaining hosts for this loop 32134 1727204429.30399: done getting the remaining hosts for this loop 32134 1727204429.30403: getting the next task for host managed-node2 32134 1727204429.30409: done getting next task for host managed-node2 32134 1727204429.30412: ^ task is: TASK: Include the task 'enable_epel.yml' 32134 1727204429.30416: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.30420: getting variables 32134 1727204429.30421: in VariableManager get_vars() 32134 1727204429.30445: Calling all_inventory to load vars for managed-node2 32134 1727204429.30447: Calling groups_inventory to load vars for managed-node2 32134 1727204429.30449: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.30457: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.30459: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.30461: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.30637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.30857: done with get_vars() 32134 1727204429.30867: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.018) 0:00:03.713 ***** 32134 1727204429.30966: entering _queue_task() for managed-node2/include_tasks 32134 1727204429.31215: worker is 1 (out of 1 available) 32134 1727204429.31227: exiting _queue_task() for managed-node2/include_tasks 32134 1727204429.31240: done queuing things up, now waiting for results queue to drain 32134 1727204429.31242: waiting for pending results... 32134 1727204429.31609: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 32134 1727204429.31656: in run() - task 12b410aa-8751-753f-5162-0000000000ba 32134 1727204429.31661: variable 'ansible_search_path' from source: unknown 32134 1727204429.31671: variable 'ansible_search_path' from source: unknown 32134 1727204429.31766: calling self._execute() 32134 1727204429.31823: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.31837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.31853: variable 'omit' from source: magic vars 32134 1727204429.32484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204429.34403: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204429.34425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204429.34466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204429.34527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204429.34557: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204429.34645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204429.34680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204429.34715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 32134 1727204429.34764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204429.34781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204429.34919: variable '__network_is_ostree' from source: set_fact 32134 1727204429.34939: Evaluated conditional (not __network_is_ostree | d(false)): True 32134 1727204429.34955: _execute() done 32134 1727204429.34959: dumping result to json 32134 1727204429.34962: done dumping result, returning 32134 1727204429.34965: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-753f-5162-0000000000ba] 32134 1727204429.34967: sending task result for task 12b410aa-8751-753f-5162-0000000000ba 32134 1727204429.35198: no more pending results, returning what we have 32134 1727204429.35204: in VariableManager get_vars() 32134 1727204429.35233: Calling all_inventory to load vars for managed-node2 32134 1727204429.35236: Calling groups_inventory to load vars for managed-node2 32134 1727204429.35240: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.35251: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.35254: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.35257: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.35476: done sending task result for task 12b410aa-8751-753f-5162-0000000000ba 32134 1727204429.35479: WORKER PROCESS EXITING 32134 1727204429.35515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.35841: done with get_vars() 32134 1727204429.35852: variable 'ansible_search_path' from source: unknown 32134 1727204429.35853: variable 'ansible_search_path' from source: unknown 32134 1727204429.35904: we have included files to process 32134 1727204429.35906: generating all_blocks data 32134 1727204429.35908: done generating all_blocks data 32134 1727204429.35919: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32134 1727204429.35920: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32134 1727204429.35924: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32134 1727204429.36801: done processing included file 32134 1727204429.36803: iterating over new_blocks loaded from include file 32134 1727204429.36804: in VariableManager get_vars() 32134 1727204429.36815: done with get_vars() 32134 1727204429.36817: filtering new block on tags 32134 1727204429.36835: done filtering new block on tags 32134 1727204429.36837: in VariableManager get_vars() 32134 1727204429.36845: done with get_vars() 32134 1727204429.36846: filtering new block on tags 32134 1727204429.36854: done filtering new block on tags 32134 1727204429.36855: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 32134 1727204429.36860: extending task lists for all hosts with included blocks 
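The include that was just processed corresponds to a task along these lines; the action (include_tasks), the conditional (not __network_is_ostree | d(false)) and the included file are shown in the log, while the exact path argument is assumed.

  - name: Include the task 'enable_epel.yml'
    include_tasks: tasks/enable_epel.yml    # relative path is an assumption
    when: not __network_is_ostree | d(false)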
32134 1727204429.36944: done extending task lists 32134 1727204429.36946: done processing included files 32134 1727204429.36946: results queue empty 32134 1727204429.36947: checking for any_errors_fatal 32134 1727204429.36949: done checking for any_errors_fatal 32134 1727204429.36950: checking for max_fail_percentage 32134 1727204429.36951: done checking for max_fail_percentage 32134 1727204429.36951: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.36952: done checking to see if all hosts have failed 32134 1727204429.36952: getting the remaining hosts for this loop 32134 1727204429.36953: done getting the remaining hosts for this loop 32134 1727204429.36955: getting the next task for host managed-node2 32134 1727204429.36958: done getting next task for host managed-node2 32134 1727204429.36960: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 32134 1727204429.36962: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.36963: getting variables 32134 1727204429.36964: in VariableManager get_vars() 32134 1727204429.36971: Calling all_inventory to load vars for managed-node2 32134 1727204429.36972: Calling groups_inventory to load vars for managed-node2 32134 1727204429.36974: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.36978: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.36984: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.36986: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.37130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.37300: done with get_vars() 32134 1727204429.37308: done getting variables 32134 1727204429.37365: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 32134 1727204429.37522: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.065) 0:00:03.779 ***** 32134 1727204429.37563: entering _queue_task() for managed-node2/command 32134 1727204429.37564: Creating lock for command 32134 1727204429.37785: worker is 1 (out of 1 available) 32134 1727204429.37800: exiting _queue_task() for managed-node2/command 32134 1727204429.37813: done queuing things up, now waiting for results queue to drain 32134 1727204429.37816: waiting for pending results... 
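The next task's name is templated from ansible_distribution_major_version (rendered here as "Create EPEL 39") and it is a command action; in the sketch below the command line itself is a placeholder assumption, and the when clause matches the false_condition reported in the skip that follows.

  - name: Create EPEL {{ ansible_distribution_major_version }}
    # the command below is an illustrative placeholder, not the real task body
    command: >-
      rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
    when: ansible_distribution in ['RedHat', 'CentOS']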
32134 1727204429.37972: running TaskExecutor() for managed-node2/TASK: Create EPEL 39 32134 1727204429.38058: in run() - task 12b410aa-8751-753f-5162-0000000000d4 32134 1727204429.38066: variable 'ansible_search_path' from source: unknown 32134 1727204429.38069: variable 'ansible_search_path' from source: unknown 32134 1727204429.38101: calling self._execute() 32134 1727204429.38167: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.38177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.38185: variable 'omit' from source: magic vars 32134 1727204429.38554: variable 'ansible_distribution' from source: facts 32134 1727204429.38559: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 32134 1727204429.38562: when evaluation is False, skipping this task 32134 1727204429.38565: _execute() done 32134 1727204429.38567: dumping result to json 32134 1727204429.38570: done dumping result, returning 32134 1727204429.38572: done running TaskExecutor() for managed-node2/TASK: Create EPEL 39 [12b410aa-8751-753f-5162-0000000000d4] 32134 1727204429.38574: sending task result for task 12b410aa-8751-753f-5162-0000000000d4 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 32134 1727204429.38860: no more pending results, returning what we have 32134 1727204429.38864: results queue empty 32134 1727204429.38865: checking for any_errors_fatal 32134 1727204429.38867: done checking for any_errors_fatal 32134 1727204429.38868: checking for max_fail_percentage 32134 1727204429.38870: done checking for max_fail_percentage 32134 1727204429.38870: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.38872: done checking to see if all hosts have failed 32134 1727204429.38873: getting the remaining hosts for this loop 32134 1727204429.38876: done getting the remaining hosts for this loop 32134 1727204429.38880: getting the next task for host managed-node2 32134 1727204429.38886: done getting next task for host managed-node2 32134 1727204429.38891: ^ task is: TASK: Install yum-utils package 32134 1727204429.38895: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.38899: getting variables 32134 1727204429.38903: in VariableManager get_vars() 32134 1727204429.38929: Calling all_inventory to load vars for managed-node2 32134 1727204429.38932: Calling groups_inventory to load vars for managed-node2 32134 1727204429.38936: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.38947: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.38951: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.38955: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.39198: done sending task result for task 12b410aa-8751-753f-5162-0000000000d4 32134 1727204429.39201: WORKER PROCESS EXITING 32134 1727204429.39265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.39610: done with get_vars() 32134 1727204429.39618: done getting variables 32134 1727204429.39705: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.021) 0:00:03.801 ***** 32134 1727204429.39731: entering _queue_task() for managed-node2/package 32134 1727204429.39732: Creating lock for package 32134 1727204429.39947: worker is 1 (out of 1 available) 32134 1727204429.39960: exiting _queue_task() for managed-node2/package 32134 1727204429.39971: done queuing things up, now waiting for results queue to drain 32134 1727204429.39973: waiting for pending results... 
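The yum-utils task uses the generic package action (the log loads ActionModule 'package') and is gated on the same distribution check; state: present in the sketch is an assumption.

  - name: Install yum-utils package
    package:
      name: yum-utils
      state: present    # assumed
    when: ansible_distribution in ['RedHat', 'CentOS']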
32134 1727204429.40126: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 32134 1727204429.40209: in run() - task 12b410aa-8751-753f-5162-0000000000d5 32134 1727204429.40223: variable 'ansible_search_path' from source: unknown 32134 1727204429.40227: variable 'ansible_search_path' from source: unknown 32134 1727204429.40254: calling self._execute() 32134 1727204429.40321: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.40332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.40336: variable 'omit' from source: magic vars 32134 1727204429.40643: variable 'ansible_distribution' from source: facts 32134 1727204429.40660: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 32134 1727204429.40664: when evaluation is False, skipping this task 32134 1727204429.40667: _execute() done 32134 1727204429.40669: dumping result to json 32134 1727204429.40672: done dumping result, returning 32134 1727204429.40675: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [12b410aa-8751-753f-5162-0000000000d5] 32134 1727204429.40681: sending task result for task 12b410aa-8751-753f-5162-0000000000d5 32134 1727204429.40782: done sending task result for task 12b410aa-8751-753f-5162-0000000000d5 32134 1727204429.40785: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 32134 1727204429.40836: no more pending results, returning what we have 32134 1727204429.40839: results queue empty 32134 1727204429.40840: checking for any_errors_fatal 32134 1727204429.40846: done checking for any_errors_fatal 32134 1727204429.40847: checking for max_fail_percentage 32134 1727204429.40848: done checking for max_fail_percentage 32134 1727204429.40849: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.40850: done checking to see if all hosts have failed 32134 1727204429.40851: getting the remaining hosts for this loop 32134 1727204429.40852: done getting the remaining hosts for this loop 32134 1727204429.40856: getting the next task for host managed-node2 32134 1727204429.40861: done getting next task for host managed-node2 32134 1727204429.40864: ^ task is: TASK: Enable EPEL 7 32134 1727204429.40867: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.40870: getting variables 32134 1727204429.40871: in VariableManager get_vars() 32134 1727204429.40906: Calling all_inventory to load vars for managed-node2 32134 1727204429.40909: Calling groups_inventory to load vars for managed-node2 32134 1727204429.40914: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.40922: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.40924: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.40926: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.41074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.41256: done with get_vars() 32134 1727204429.41263: done getting variables 32134 1727204429.41309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.015) 0:00:03.817 ***** 32134 1727204429.41334: entering _queue_task() for managed-node2/command 32134 1727204429.41520: worker is 1 (out of 1 available) 32134 1727204429.41531: exiting _queue_task() for managed-node2/command 32134 1727204429.41544: done queuing things up, now waiting for results queue to drain 32134 1727204429.41546: waiting for pending results... 32134 1727204429.41708: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 32134 1727204429.41783: in run() - task 12b410aa-8751-753f-5162-0000000000d6 32134 1727204429.41798: variable 'ansible_search_path' from source: unknown 32134 1727204429.41802: variable 'ansible_search_path' from source: unknown 32134 1727204429.41833: calling self._execute() 32134 1727204429.41896: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.41906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.41914: variable 'omit' from source: magic vars 32134 1727204429.42232: variable 'ansible_distribution' from source: facts 32134 1727204429.42243: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 32134 1727204429.42247: when evaluation is False, skipping this task 32134 1727204429.42250: _execute() done 32134 1727204429.42253: dumping result to json 32134 1727204429.42259: done dumping result, returning 32134 1727204429.42266: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [12b410aa-8751-753f-5162-0000000000d6] 32134 1727204429.42271: sending task result for task 12b410aa-8751-753f-5162-0000000000d6 32134 1727204429.42366: done sending task result for task 12b410aa-8751-753f-5162-0000000000d6 32134 1727204429.42369: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 32134 1727204429.42417: no more pending results, returning what we have 32134 1727204429.42420: results queue empty 32134 1727204429.42421: checking for any_errors_fatal 32134 1727204429.42425: done checking for any_errors_fatal 32134 1727204429.42426: checking 
for max_fail_percentage 32134 1727204429.42428: done checking for max_fail_percentage 32134 1727204429.42429: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.42430: done checking to see if all hosts have failed 32134 1727204429.42431: getting the remaining hosts for this loop 32134 1727204429.42432: done getting the remaining hosts for this loop 32134 1727204429.42436: getting the next task for host managed-node2 32134 1727204429.42442: done getting next task for host managed-node2 32134 1727204429.42444: ^ task is: TASK: Enable EPEL 8 32134 1727204429.42448: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.42450: getting variables 32134 1727204429.42452: in VariableManager get_vars() 32134 1727204429.42477: Calling all_inventory to load vars for managed-node2 32134 1727204429.42480: Calling groups_inventory to load vars for managed-node2 32134 1727204429.42483: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.42499: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.42501: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.42504: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.42676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.42856: done with get_vars() 32134 1727204429.42863: done getting variables 32134 1727204429.42908: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.015) 0:00:03.833 ***** 32134 1727204429.42932: entering _queue_task() for managed-node2/command 32134 1727204429.43112: worker is 1 (out of 1 available) 32134 1727204429.43125: exiting _queue_task() for managed-node2/command 32134 1727204429.43137: done queuing things up, now waiting for results queue to drain 32134 1727204429.43139: waiting for pending results... 
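Both "Enable EPEL" tasks are command actions skipped on the same distribution check; the specific command lines below are assumptions added for illustration, since the log does not show them.

  - name: Enable EPEL 7
    command: yum-config-manager --enable epel    # command is an assumption
    when: ansible_distribution in ['RedHat', 'CentOS']

  - name: Enable EPEL 8
    command: dnf config-manager --set-enabled epel    # command is an assumption
    when: ansible_distribution in ['RedHat', 'CentOS']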
32134 1727204429.43287: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 32134 1727204429.43363: in run() - task 12b410aa-8751-753f-5162-0000000000d7 32134 1727204429.43380: variable 'ansible_search_path' from source: unknown 32134 1727204429.43384: variable 'ansible_search_path' from source: unknown 32134 1727204429.43409: calling self._execute() 32134 1727204429.43469: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.43477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.43492: variable 'omit' from source: magic vars 32134 1727204429.43787: variable 'ansible_distribution' from source: facts 32134 1727204429.43806: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 32134 1727204429.43812: when evaluation is False, skipping this task 32134 1727204429.43815: _execute() done 32134 1727204429.43817: dumping result to json 32134 1727204429.43820: done dumping result, returning 32134 1727204429.43833: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [12b410aa-8751-753f-5162-0000000000d7] 32134 1727204429.43836: sending task result for task 12b410aa-8751-753f-5162-0000000000d7 32134 1727204429.43927: done sending task result for task 12b410aa-8751-753f-5162-0000000000d7 32134 1727204429.43932: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 32134 1727204429.43977: no more pending results, returning what we have 32134 1727204429.43980: results queue empty 32134 1727204429.43981: checking for any_errors_fatal 32134 1727204429.43986: done checking for any_errors_fatal 32134 1727204429.43987: checking for max_fail_percentage 32134 1727204429.43988: done checking for max_fail_percentage 32134 1727204429.43991: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.43992: done checking to see if all hosts have failed 32134 1727204429.43993: getting the remaining hosts for this loop 32134 1727204429.43995: done getting the remaining hosts for this loop 32134 1727204429.43998: getting the next task for host managed-node2 32134 1727204429.44006: done getting next task for host managed-node2 32134 1727204429.44009: ^ task is: TASK: Enable EPEL 6 32134 1727204429.44013: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.44016: getting variables 32134 1727204429.44017: in VariableManager get_vars() 32134 1727204429.44046: Calling all_inventory to load vars for managed-node2 32134 1727204429.44048: Calling groups_inventory to load vars for managed-node2 32134 1727204429.44051: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.44058: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.44061: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.44063: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.44210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.44388: done with get_vars() 32134 1727204429.44397: done getting variables 32134 1727204429.44441: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.015) 0:00:03.848 ***** 32134 1727204429.44461: entering _queue_task() for managed-node2/copy 32134 1727204429.44630: worker is 1 (out of 1 available) 32134 1727204429.44643: exiting _queue_task() for managed-node2/copy 32134 1727204429.44655: done queuing things up, now waiting for results queue to drain 32134 1727204429.44657: waiting for pending results... 32134 1727204429.44819: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 32134 1727204429.44891: in run() - task 12b410aa-8751-753f-5162-0000000000d9 32134 1727204429.44903: variable 'ansible_search_path' from source: unknown 32134 1727204429.44907: variable 'ansible_search_path' from source: unknown 32134 1727204429.44939: calling self._execute() 32134 1727204429.44992: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.45003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.45013: variable 'omit' from source: magic vars 32134 1727204429.45531: variable 'ansible_distribution' from source: facts 32134 1727204429.45540: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 32134 1727204429.45544: when evaluation is False, skipping this task 32134 1727204429.45548: _execute() done 32134 1727204429.45552: dumping result to json 32134 1727204429.45557: done dumping result, returning 32134 1727204429.45564: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [12b410aa-8751-753f-5162-0000000000d9] 32134 1727204429.45569: sending task result for task 12b410aa-8751-753f-5162-0000000000d9 32134 1727204429.45666: done sending task result for task 12b410aa-8751-753f-5162-0000000000d9 32134 1727204429.45669: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 32134 1727204429.45716: no more pending results, returning what we have 32134 1727204429.45720: results queue empty 32134 1727204429.45721: checking for any_errors_fatal 32134 1727204429.45725: done checking for any_errors_fatal 32134 1727204429.45726: checking for 
max_fail_percentage 32134 1727204429.45727: done checking for max_fail_percentage 32134 1727204429.45728: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.45729: done checking to see if all hosts have failed 32134 1727204429.45730: getting the remaining hosts for this loop 32134 1727204429.45731: done getting the remaining hosts for this loop 32134 1727204429.45735: getting the next task for host managed-node2 32134 1727204429.45743: done getting next task for host managed-node2 32134 1727204429.45745: ^ task is: TASK: Set network provider to 'nm' 32134 1727204429.45748: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.45751: getting variables 32134 1727204429.45752: in VariableManager get_vars() 32134 1727204429.45777: Calling all_inventory to load vars for managed-node2 32134 1727204429.45780: Calling groups_inventory to load vars for managed-node2 32134 1727204429.45783: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.45794: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.45796: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.45799: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.46097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.46268: done with get_vars() 32134 1727204429.46275: done getting variables 32134 1727204429.46319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:13 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.018) 0:00:03.867 ***** 32134 1727204429.46341: entering _queue_task() for managed-node2/set_fact 32134 1727204429.46518: worker is 1 (out of 1 available) 32134 1727204429.46531: exiting _queue_task() for managed-node2/set_fact 32134 1727204429.46541: done queuing things up, now waiting for results queue to drain 32134 1727204429.46543: waiting for pending results... 
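[Annotation] Both "Enable EPEL" tasks above are skipped because the conditional ansible_distribution in ['RedHat', 'CentOS'] evaluates to False on this managed node (the facts gathered later in this run report ansible_distribution: Fedora, version 39). The sketch below is a plausible reconstruction of the relevant part of tasks/enable_epel.yml, under stated assumptions: the task names and the when: expression are copied from the log, and the 'copy' action for "Enable EPEL 6" matches the action plugin the log loads for it, but the module arguments are placeholders, since the file contents are not part of this output.

  # Hypothetical sketch of tasks/enable_epel.yml (illustrative only)
  - name: Enable EPEL 8
    # Module and arguments are assumptions; only the task name and conditional
    # are confirmed by the log above.
    ansible.builtin.command: dnf config-manager --set-enabled epel
    when: ansible_distribution in ['RedHat', 'CentOS']

  - name: Enable EPEL 6
    # The log loads the 'copy' action plugin for this task; src/dest are assumptions.
    ansible.builtin.copy:
      src: epel.repo
      dest: /etc/yum.repos.d/epel.repo
    when: ansible_distribution in ['RedHat', 'CentOS']

On this Fedora 39 host neither conditional holds, which is exactly the skipping: [managed-node2] result with "skip_reason": "Conditional result was False" recorded above.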
32134 1727204429.46697: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 32134 1727204429.46767: in run() - task 12b410aa-8751-753f-5162-000000000007 32134 1727204429.46783: variable 'ansible_search_path' from source: unknown 32134 1727204429.46819: calling self._execute() 32134 1727204429.46881: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.46885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.46903: variable 'omit' from source: magic vars 32134 1727204429.46983: variable 'omit' from source: magic vars 32134 1727204429.47019: variable 'omit' from source: magic vars 32134 1727204429.47047: variable 'omit' from source: magic vars 32134 1727204429.47083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204429.47123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204429.47139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204429.47157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204429.47168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204429.47196: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204429.47201: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.47206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.47293: Set connection var ansible_timeout to 10 32134 1727204429.47305: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204429.47308: Set connection var ansible_connection to ssh 32134 1727204429.47311: Set connection var ansible_shell_type to sh 32134 1727204429.47320: Set connection var ansible_shell_executable to /bin/sh 32134 1727204429.47326: Set connection var ansible_pipelining to False 32134 1727204429.47349: variable 'ansible_shell_executable' from source: unknown 32134 1727204429.47353: variable 'ansible_connection' from source: unknown 32134 1727204429.47356: variable 'ansible_module_compression' from source: unknown 32134 1727204429.47358: variable 'ansible_shell_type' from source: unknown 32134 1727204429.47361: variable 'ansible_shell_executable' from source: unknown 32134 1727204429.47364: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.47369: variable 'ansible_pipelining' from source: unknown 32134 1727204429.47372: variable 'ansible_timeout' from source: unknown 32134 1727204429.47377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.47497: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204429.47507: variable 'omit' from source: magic vars 32134 1727204429.47515: starting attempt loop 32134 1727204429.47519: running the handler 32134 1727204429.47530: handler run complete 32134 1727204429.47538: attempt loop complete, returning result 32134 1727204429.47541: _execute() done 32134 1727204429.47545: 
dumping result to json 32134 1727204429.47555: done dumping result, returning 32134 1727204429.47560: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [12b410aa-8751-753f-5162-000000000007] 32134 1727204429.47563: sending task result for task 12b410aa-8751-753f-5162-000000000007 32134 1727204429.47652: done sending task result for task 12b410aa-8751-753f-5162-000000000007 32134 1727204429.47658: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 32134 1727204429.47719: no more pending results, returning what we have 32134 1727204429.47722: results queue empty 32134 1727204429.47723: checking for any_errors_fatal 32134 1727204429.47729: done checking for any_errors_fatal 32134 1727204429.47730: checking for max_fail_percentage 32134 1727204429.47731: done checking for max_fail_percentage 32134 1727204429.47732: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.47733: done checking to see if all hosts have failed 32134 1727204429.47734: getting the remaining hosts for this loop 32134 1727204429.47735: done getting the remaining hosts for this loop 32134 1727204429.47738: getting the next task for host managed-node2 32134 1727204429.47744: done getting next task for host managed-node2 32134 1727204429.47746: ^ task is: TASK: meta (flush_handlers) 32134 1727204429.47748: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.47752: getting variables 32134 1727204429.47753: in VariableManager get_vars() 32134 1727204429.47779: Calling all_inventory to load vars for managed-node2 32134 1727204429.47781: Calling groups_inventory to load vars for managed-node2 32134 1727204429.47783: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.47793: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.47795: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.47797: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.47942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.48134: done with get_vars() 32134 1727204429.48141: done getting variables 32134 1727204429.48192: in VariableManager get_vars() 32134 1727204429.48199: Calling all_inventory to load vars for managed-node2 32134 1727204429.48202: Calling groups_inventory to load vars for managed-node2 32134 1727204429.48205: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.48208: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.48210: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.48216: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.48337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.48503: done with get_vars() 32134 1727204429.48515: done queuing things up, now waiting for results queue to drain 32134 1727204429.48517: results queue empty 32134 1727204429.48517: checking for any_errors_fatal 32134 1727204429.48519: done checking for any_errors_fatal 32134 1727204429.48519: checking for 
max_fail_percentage 32134 1727204429.48520: done checking for max_fail_percentage 32134 1727204429.48521: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.48521: done checking to see if all hosts have failed 32134 1727204429.48522: getting the remaining hosts for this loop 32134 1727204429.48522: done getting the remaining hosts for this loop 32134 1727204429.48524: getting the next task for host managed-node2 32134 1727204429.48528: done getting next task for host managed-node2 32134 1727204429.48529: ^ task is: TASK: meta (flush_handlers) 32134 1727204429.48531: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.48537: getting variables 32134 1727204429.48538: in VariableManager get_vars() 32134 1727204429.48545: Calling all_inventory to load vars for managed-node2 32134 1727204429.48547: Calling groups_inventory to load vars for managed-node2 32134 1727204429.48549: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.48552: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.48554: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.48556: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.48677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.48863: done with get_vars() 32134 1727204429.48871: done getting variables 32134 1727204429.48907: in VariableManager get_vars() 32134 1727204429.48915: Calling all_inventory to load vars for managed-node2 32134 1727204429.48917: Calling groups_inventory to load vars for managed-node2 32134 1727204429.48919: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.48922: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.48924: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.48926: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.49045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.49218: done with get_vars() 32134 1727204429.49227: done queuing things up, now waiting for results queue to drain 32134 1727204429.49228: results queue empty 32134 1727204429.49229: checking for any_errors_fatal 32134 1727204429.49230: done checking for any_errors_fatal 32134 1727204429.49230: checking for max_fail_percentage 32134 1727204429.49231: done checking for max_fail_percentage 32134 1727204429.49232: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.49232: done checking to see if all hosts have failed 32134 1727204429.49233: getting the remaining hosts for this loop 32134 1727204429.49233: done getting the remaining hosts for this loop 32134 1727204429.49235: getting the next task for host managed-node2 32134 1727204429.49237: done getting next task for host managed-node2 32134 1727204429.49238: ^ task is: None 32134 1727204429.49239: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.49240: done queuing things up, now waiting for results queue to drain 32134 1727204429.49240: results queue empty 32134 1727204429.49241: checking for any_errors_fatal 32134 1727204429.49241: done checking for any_errors_fatal 32134 1727204429.49242: checking for max_fail_percentage 32134 1727204429.49243: done checking for max_fail_percentage 32134 1727204429.49243: checking to see if all hosts have failed and the running result is not ok 32134 1727204429.49244: done checking to see if all hosts have failed 32134 1727204429.49245: getting the next task for host managed-node2 32134 1727204429.49247: done getting next task for host managed-node2 32134 1727204429.49247: ^ task is: None 32134 1727204429.49248: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204429.49286: in VariableManager get_vars() 32134 1727204429.49307: done with get_vars() 32134 1727204429.49315: in VariableManager get_vars() 32134 1727204429.49325: done with get_vars() 32134 1727204429.49328: variable 'omit' from source: magic vars 32134 1727204429.49350: in VariableManager get_vars() 32134 1727204429.49359: done with get_vars() 32134 1727204429.49375: variable 'omit' from source: magic vars PLAY [Play for testing ipv6 disabled] ****************************************** 32134 1727204429.49594: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 32134 1727204429.49619: getting the remaining hosts for this loop 32134 1727204429.49621: done getting the remaining hosts for this loop 32134 1727204429.49623: getting the next task for host managed-node2 32134 1727204429.49625: done getting next task for host managed-node2 32134 1727204429.49626: ^ task is: TASK: Gathering Facts 32134 1727204429.49628: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204429.49630: getting variables 32134 1727204429.49631: in VariableManager get_vars() 32134 1727204429.49640: Calling all_inventory to load vars for managed-node2 32134 1727204429.49642: Calling groups_inventory to load vars for managed-node2 32134 1727204429.49643: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204429.49647: Calling all_plugins_play to load vars for managed-node2 32134 1727204429.49657: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204429.49660: Calling groups_plugins_play to load vars for managed-node2 32134 1727204429.49814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204429.49985: done with get_vars() 32134 1727204429.49993: done getting variables 32134 1727204429.50025: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 Tuesday 24 September 2024 15:00:29 -0400 (0:00:00.037) 0:00:03.904 ***** 32134 1727204429.50043: entering _queue_task() for managed-node2/gather_facts 32134 1727204429.50221: worker is 1 (out of 1 available) 32134 1727204429.50233: exiting _queue_task() for managed-node2/gather_facts 32134 1727204429.50243: done queuing things up, now waiting for results queue to drain 32134 1727204429.50246: waiting for pending results... 
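[Annotation] The "Gathering Facts" task that runs next is the standard setup module: the trace below shows Ansible opening an SSH connection to 10.31.9.159, creating a remote temp directory under /root/.ansible/tmp, transferring AnsiballZ_setup.py over SFTP, and executing it with /usr/bin/python3.12. A minimal sketch that would reproduce the same fact collection on its own is shown here; gather_subset and gather_timeout match the module_args reported in the result, while the play wording itself is illustrative.

  # Minimal sketch reproducing the fact gathering traced below (illustrative only)
  - name: Gather the same facts the implicit setup run collects
    hosts: managed-node2
    gather_facts: false          # run setup explicitly instead of implicitly
    tasks:
      - name: Gathering Facts
        ansible.builtin.setup:
          gather_subset:
            - all                # matches "gather_subset": ["all"] in the logged module_args
          gather_timeout: 10     # matches "gather_timeout": 10 in the logged module_args

The JSON blob returned at the end of this exchange (ansible_distribution, ansible_interfaces, ansible_default_ipv4, and so on) is what later tasks in tests_ipv6_disabled.yml consume as facts.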
32134 1727204429.50398: running TaskExecutor() for managed-node2/TASK: Gathering Facts 32134 1727204429.50469: in run() - task 12b410aa-8751-753f-5162-0000000000ff 32134 1727204429.50483: variable 'ansible_search_path' from source: unknown 32134 1727204429.50520: calling self._execute() 32134 1727204429.50587: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.50592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.50604: variable 'omit' from source: magic vars 32134 1727204429.50912: variable 'ansible_distribution_major_version' from source: facts 32134 1727204429.50924: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204429.50932: variable 'omit' from source: magic vars 32134 1727204429.50956: variable 'omit' from source: magic vars 32134 1727204429.50986: variable 'omit' from source: magic vars 32134 1727204429.51024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204429.51054: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204429.51076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204429.51092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204429.51103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204429.51133: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204429.51138: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.51141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.51230: Set connection var ansible_timeout to 10 32134 1727204429.51242: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204429.51245: Set connection var ansible_connection to ssh 32134 1727204429.51248: Set connection var ansible_shell_type to sh 32134 1727204429.51255: Set connection var ansible_shell_executable to /bin/sh 32134 1727204429.51266: Set connection var ansible_pipelining to False 32134 1727204429.51283: variable 'ansible_shell_executable' from source: unknown 32134 1727204429.51288: variable 'ansible_connection' from source: unknown 32134 1727204429.51291: variable 'ansible_module_compression' from source: unknown 32134 1727204429.51301: variable 'ansible_shell_type' from source: unknown 32134 1727204429.51304: variable 'ansible_shell_executable' from source: unknown 32134 1727204429.51307: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204429.51309: variable 'ansible_pipelining' from source: unknown 32134 1727204429.51311: variable 'ansible_timeout' from source: unknown 32134 1727204429.51319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204429.51470: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204429.51482: variable 'omit' from source: magic vars 32134 1727204429.51485: starting attempt loop 32134 1727204429.51494: running the 
handler 32134 1727204429.51506: variable 'ansible_facts' from source: unknown 32134 1727204429.51528: _low_level_execute_command(): starting 32134 1727204429.51535: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204429.52063: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.52100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.52104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.52106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.52161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204429.52164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204429.52222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204429.54649: stdout chunk (state=3): >>>/root <<< 32134 1727204429.54835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204429.54886: stderr chunk (state=3): >>><<< 32134 1727204429.54892: stdout chunk (state=3): >>><<< 32134 1727204429.54917: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204429.54930: _low_level_execute_command(): starting 32134 1727204429.54936: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919 `" && echo ansible-tmp-1727204429.549141-32332-197868557412919="` echo 
/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919 `" ) && sleep 0' 32134 1727204429.55432: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204429.55436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204429.55438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.55441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.55475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.55492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204429.55513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204429.55554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204429.58420: stdout chunk (state=3): >>>ansible-tmp-1727204429.549141-32332-197868557412919=/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919 <<< 32134 1727204429.58682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204429.58686: stdout chunk (state=3): >>><<< 32134 1727204429.58688: stderr chunk (state=3): >>><<< 32134 1727204429.58735: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204429.549141-32332-197868557412919=/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204429.58792: variable 'ansible_module_compression' from source: unknown 32134 1727204429.58862: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32134 1727204429.58934: variable 'ansible_facts' from source: unknown 32134 1727204429.59064: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py 32134 1727204429.59204: Sending initial data 32134 1727204429.59207: Sent initial data (153 bytes) 32134 1727204429.59786: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.59792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204429.59795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.59797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204429.59799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204429.59815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.59857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204429.59872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204429.59933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204429.62440: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204429.62484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204429.62551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpe8rv332m /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py <<< 32134 1727204429.62555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py" <<< 32134 1727204429.62591: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpe8rv332m" to remote "/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py" <<< 32134 1727204429.65523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204429.65527: stdout chunk (state=3): >>><<< 32134 1727204429.65530: stderr chunk (state=3): >>><<< 32134 1727204429.65532: done transferring module to remote 32134 1727204429.65535: _low_level_execute_command(): starting 32134 1727204429.65537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/ /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py && sleep 0' 32134 1727204429.66742: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204429.66948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204429.66993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204429.69941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204429.70111: stderr chunk (state=3): >>><<< 32134 1727204429.70123: stdout chunk (state=3): >>><<< 32134 1727204429.70408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204429.70412: _low_level_execute_command(): starting 32134 1727204429.70415: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/AnsiballZ_setup.py && sleep 0' 32134 1727204429.70952: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204429.70965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204429.70981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204429.71006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204429.71112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204429.71127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204429.71144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204429.71160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204429.71249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204430.65630: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2808, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 909, "free": 2808}, "nocache": {"free": 3446, "used": 271}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis<<< 32134 1727204430.65645: stdout chunk (state=3): >>>_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 934, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144724480, "block_size": 4096, "block_total": 64479564, "block_available": 61314630, "block_used": 3164934, "inode_total": 16384000, "inode_available": 16302235, "inode_used": 81765, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.68505859375, "5m": 0.69384765625, "15m": 0.46630859375}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "30", "epoch": "1727204430", "epoch_int": "1727204430", "date": "2024-09-24", "time": "15:00:30", "iso8601_micro": "2024-09-24T19:00:30.589812Z", "iso8601": "2024-09-24T19:00:30Z", "iso8601_basic": "20240924T150030589812", "iso8601_basic_short": "20240924T150030", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": 
"UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "c<<< 32134 1727204430.65685: stdout chunk (state=3): >>>rashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fi<<< 32134 1727204430.65693: stdout chunk (state=3): >>>xed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32134 1727204430.68665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204430.68734: stderr chunk (state=3): >>><<< 32134 1727204430.68737: stdout chunk (state=3): >>><<< 32134 1727204430.68774: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2808, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 909, "free": 2808}, "nocache": {"free": 3446, "used": 271}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 934, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144724480, "block_size": 4096, "block_total": 64479564, "block_available": 61314630, "block_used": 3164934, "inode_total": 16384000, "inode_available": 16302235, "inode_used": 81765, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.68505859375, "5m": 0.69384765625, "15m": 0.46630859375}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", 
"SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "30", "epoch": "1727204430", "epoch_int": "1727204430", "date": "2024-09-24", "time": "15:00:30", "iso8601_micro": "2024-09-24T19:00:30.589812Z", "iso8601": "2024-09-24T19:00:30Z", "iso8601_basic": "20240924T150030589812", "iso8601_basic_short": "20240924T150030", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": 
"off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": 
"12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
32134 1727204430.69107: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204430.69133: _low_level_execute_command(): starting 32134 1727204430.69138: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204429.549141-32332-197868557412919/ > /dev/null 2>&1 && sleep 0' 32134 1727204430.69622: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204430.69625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204430.69628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204430.69631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204430.69633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204430.69693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204430.69697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204430.69739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204430.72673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204430.72677: stdout chunk (state=3): >>><<< 32134 1727204430.72679: stderr chunk (state=3): >>><<< 32134 1727204430.72709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204430.72727: handler run complete 32134 1727204430.73098: variable 'ansible_facts' from source: unknown 32134 1727204430.73107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.73632: variable 'ansible_facts' from source: unknown 32134 1727204430.73766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.73975: attempt loop complete, returning result 32134 1727204430.73979: _execute() done 32134 1727204430.73984: dumping result to json 32134 1727204430.74043: done dumping result, returning 32134 1727204430.74051: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-753f-5162-0000000000ff] 32134 1727204430.74056: sending task result for task 12b410aa-8751-753f-5162-0000000000ff 32134 1727204430.74388: done sending task result for task 12b410aa-8751-753f-5162-0000000000ff ok: [managed-node2] 32134 1727204430.74703: no more pending results, returning what we have 32134 1727204430.74705: results queue empty 32134 1727204430.74706: checking for any_errors_fatal 32134 1727204430.74707: done checking for any_errors_fatal 32134 1727204430.74707: checking for max_fail_percentage 32134 1727204430.74709: done checking for max_fail_percentage 32134 1727204430.74709: checking to see if all hosts have failed and the running result is not ok 32134 1727204430.74710: done checking to see if all hosts have failed 32134 1727204430.74711: getting the remaining hosts for this loop 32134 1727204430.74713: done getting the remaining hosts for this loop 32134 1727204430.74716: getting the next task for host managed-node2 32134 1727204430.74720: done getting next task for host managed-node2 32134 1727204430.74722: ^ task is: TASK: meta (flush_handlers) 32134 1727204430.74723: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204430.74726: getting variables 32134 1727204430.74727: in VariableManager get_vars() 32134 1727204430.74750: Calling all_inventory to load vars for managed-node2 32134 1727204430.74752: Calling groups_inventory to load vars for managed-node2 32134 1727204430.74754: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.74764: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.74766: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.74770: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.74913: WORKER PROCESS EXITING 32134 1727204430.74926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.75120: done with get_vars() 32134 1727204430.75130: done getting variables 32134 1727204430.75185: in VariableManager get_vars() 32134 1727204430.75197: Calling all_inventory to load vars for managed-node2 32134 1727204430.75199: Calling groups_inventory to load vars for managed-node2 32134 1727204430.75201: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.75204: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.75206: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.75208: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.75337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.75591: done with get_vars() 32134 1727204430.75607: done queuing things up, now waiting for results queue to drain 32134 1727204430.75609: results queue empty 32134 1727204430.75611: checking for any_errors_fatal 32134 1727204430.75618: done checking for any_errors_fatal 32134 1727204430.75619: checking for max_fail_percentage 32134 1727204430.75620: done checking for max_fail_percentage 32134 1727204430.75621: checking to see if all hosts have failed and the running result is not ok 32134 1727204430.75626: done checking to see if all hosts have failed 32134 1727204430.75627: getting the remaining hosts for this loop 32134 1727204430.75628: done getting the remaining hosts for this loop 32134 1727204430.75631: getting the next task for host managed-node2 32134 1727204430.75636: done getting next task for host managed-node2 32134 1727204430.75638: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 32134 1727204430.75640: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204430.75642: getting variables 32134 1727204430.75644: in VariableManager get_vars() 32134 1727204430.75656: Calling all_inventory to load vars for managed-node2 32134 1727204430.75659: Calling groups_inventory to load vars for managed-node2 32134 1727204430.75662: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.75667: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.75670: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.75674: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.75880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.76187: done with get_vars() 32134 1727204430.76199: done getting variables 32134 1727204430.76252: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204430.76427: variable 'type' from source: play vars 32134 1727204430.76433: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:10 Tuesday 24 September 2024 15:00:30 -0400 (0:00:01.264) 0:00:05.168 ***** 32134 1727204430.76475: entering _queue_task() for managed-node2/set_fact 32134 1727204430.76769: worker is 1 (out of 1 available) 32134 1727204430.76783: exiting _queue_task() for managed-node2/set_fact 32134 1727204430.76802: done queuing things up, now waiting for results queue to drain 32134 1727204430.76805: waiting for pending results... 
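
The task being queued here comes from tests_ipv6_disabled.yml:10. Its source file is not reproduced in this log, but based on the templated task name and the ansible_facts it sets a few entries below (interface: ethtest0, type: veth), it is presumably a plain set_fact over the two play vars, roughly like this hedged sketch:

- name: Set type={{ type }} and interface={{ interface }}
  ansible.builtin.set_fact:
    type: "{{ type }}"            # resolves to "veth" from play vars in this run
    interface: "{{ interface }}"  # resolves to "ethtest0" from play vars in this run
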
32134 1727204430.76962: running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=ethtest0 32134 1727204430.77046: in run() - task 12b410aa-8751-753f-5162-00000000000b 32134 1727204430.77055: variable 'ansible_search_path' from source: unknown 32134 1727204430.77086: calling self._execute() 32134 1727204430.77244: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.77248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.77262: variable 'omit' from source: magic vars 32134 1727204430.77546: variable 'ansible_distribution_major_version' from source: facts 32134 1727204430.77558: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204430.77565: variable 'omit' from source: magic vars 32134 1727204430.77596: variable 'omit' from source: magic vars 32134 1727204430.77617: variable 'type' from source: play vars 32134 1727204430.77675: variable 'type' from source: play vars 32134 1727204430.77685: variable 'interface' from source: play vars 32134 1727204430.77741: variable 'interface' from source: play vars 32134 1727204430.77755: variable 'omit' from source: magic vars 32134 1727204430.77791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204430.77827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204430.77845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204430.77861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204430.77872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204430.77902: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204430.77906: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.77909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.77997: Set connection var ansible_timeout to 10 32134 1727204430.78009: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204430.78015: Set connection var ansible_connection to ssh 32134 1727204430.78017: Set connection var ansible_shell_type to sh 32134 1727204430.78032: Set connection var ansible_shell_executable to /bin/sh 32134 1727204430.78037: Set connection var ansible_pipelining to False 32134 1727204430.78052: variable 'ansible_shell_executable' from source: unknown 32134 1727204430.78055: variable 'ansible_connection' from source: unknown 32134 1727204430.78058: variable 'ansible_module_compression' from source: unknown 32134 1727204430.78062: variable 'ansible_shell_type' from source: unknown 32134 1727204430.78065: variable 'ansible_shell_executable' from source: unknown 32134 1727204430.78070: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.78075: variable 'ansible_pipelining' from source: unknown 32134 1727204430.78078: variable 'ansible_timeout' from source: unknown 32134 1727204430.78083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.78205: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204430.78217: variable 'omit' from source: magic vars 32134 1727204430.78221: starting attempt loop 32134 1727204430.78223: running the handler 32134 1727204430.78237: handler run complete 32134 1727204430.78248: attempt loop complete, returning result 32134 1727204430.78255: _execute() done 32134 1727204430.78262: dumping result to json 32134 1727204430.78266: done dumping result, returning 32134 1727204430.78273: done running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=ethtest0 [12b410aa-8751-753f-5162-00000000000b] 32134 1727204430.78278: sending task result for task 12b410aa-8751-753f-5162-00000000000b 32134 1727204430.78366: done sending task result for task 12b410aa-8751-753f-5162-00000000000b 32134 1727204430.78370: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 32134 1727204430.78548: no more pending results, returning what we have 32134 1727204430.78551: results queue empty 32134 1727204430.78552: checking for any_errors_fatal 32134 1727204430.78554: done checking for any_errors_fatal 32134 1727204430.78555: checking for max_fail_percentage 32134 1727204430.78556: done checking for max_fail_percentage 32134 1727204430.78557: checking to see if all hosts have failed and the running result is not ok 32134 1727204430.78558: done checking to see if all hosts have failed 32134 1727204430.78559: getting the remaining hosts for this loop 32134 1727204430.78561: done getting the remaining hosts for this loop 32134 1727204430.78564: getting the next task for host managed-node2 32134 1727204430.78568: done getting next task for host managed-node2 32134 1727204430.78571: ^ task is: TASK: Include the task 'show_interfaces.yml' 32134 1727204430.78574: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204430.78577: getting variables 32134 1727204430.78578: in VariableManager get_vars() 32134 1727204430.78675: Calling all_inventory to load vars for managed-node2 32134 1727204430.78679: Calling groups_inventory to load vars for managed-node2 32134 1727204430.78682: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.78695: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.78698: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.78703: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.79235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.79573: done with get_vars() 32134 1727204430.79584: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:14 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.032) 0:00:05.200 ***** 32134 1727204430.79698: entering _queue_task() for managed-node2/include_tasks 32134 1727204430.80039: worker is 1 (out of 1 available) 32134 1727204430.80194: exiting _queue_task() for managed-node2/include_tasks 32134 1727204430.80205: done queuing things up, now waiting for results queue to drain 32134 1727204430.80207: waiting for pending results... 32134 1727204430.80338: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 32134 1727204430.80549: in run() - task 12b410aa-8751-753f-5162-00000000000c 32134 1727204430.80553: variable 'ansible_search_path' from source: unknown 32134 1727204430.80556: calling self._execute() 32134 1727204430.80608: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.80621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.80636: variable 'omit' from source: magic vars 32134 1727204430.81079: variable 'ansible_distribution_major_version' from source: facts 32134 1727204430.81111: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204430.81124: _execute() done 32134 1727204430.81133: dumping result to json 32134 1727204430.81141: done dumping result, returning 32134 1727204430.81152: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-753f-5162-00000000000c] 32134 1727204430.81162: sending task result for task 12b410aa-8751-753f-5162-00000000000c 32134 1727204430.81335: no more pending results, returning what we have 32134 1727204430.81341: in VariableManager get_vars() 32134 1727204430.81386: Calling all_inventory to load vars for managed-node2 32134 1727204430.81392: Calling groups_inventory to load vars for managed-node2 32134 1727204430.81395: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.81413: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.81416: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.81421: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.81908: done sending task result for task 12b410aa-8751-753f-5162-00000000000c 32134 1727204430.81912: WORKER PROCESS EXITING 32134 1727204430.81946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.82282: done with get_vars() 32134 
1727204430.82293: variable 'ansible_search_path' from source: unknown 32134 1727204430.82307: we have included files to process 32134 1727204430.82309: generating all_blocks data 32134 1727204430.82310: done generating all_blocks data 32134 1727204430.82311: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32134 1727204430.82312: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32134 1727204430.82315: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32134 1727204430.82510: in VariableManager get_vars() 32134 1727204430.82534: done with get_vars() 32134 1727204430.82670: done processing included file 32134 1727204430.82672: iterating over new_blocks loaded from include file 32134 1727204430.82674: in VariableManager get_vars() 32134 1727204430.82700: done with get_vars() 32134 1727204430.82702: filtering new block on tags 32134 1727204430.82723: done filtering new block on tags 32134 1727204430.82726: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 32134 1727204430.82732: extending task lists for all hosts with included blocks 32134 1727204430.84677: done extending task lists 32134 1727204430.84680: done processing included files 32134 1727204430.84681: results queue empty 32134 1727204430.84681: checking for any_errors_fatal 32134 1727204430.84685: done checking for any_errors_fatal 32134 1727204430.84686: checking for max_fail_percentage 32134 1727204430.84688: done checking for max_fail_percentage 32134 1727204430.84690: checking to see if all hosts have failed and the running result is not ok 32134 1727204430.84762: done checking to see if all hosts have failed 32134 1727204430.84763: getting the remaining hosts for this loop 32134 1727204430.84764: done getting the remaining hosts for this loop 32134 1727204430.84768: getting the next task for host managed-node2 32134 1727204430.84773: done getting next task for host managed-node2 32134 1727204430.84775: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32134 1727204430.84778: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204430.84781: getting variables 32134 1727204430.84782: in VariableManager get_vars() 32134 1727204430.84800: Calling all_inventory to load vars for managed-node2 32134 1727204430.84803: Calling groups_inventory to load vars for managed-node2 32134 1727204430.84805: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.84812: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.84815: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.84819: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.85288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.85694: done with get_vars() 32134 1727204430.85707: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.061) 0:00:05.262 ***** 32134 1727204430.85811: entering _queue_task() for managed-node2/include_tasks 32134 1727204430.86303: worker is 1 (out of 1 available) 32134 1727204430.86317: exiting _queue_task() for managed-node2/include_tasks 32134 1727204430.86329: done queuing things up, now waiting for results queue to drain 32134 1727204430.86332: waiting for pending results... 32134 1727204430.86535: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 32134 1727204430.86680: in run() - task 12b410aa-8751-753f-5162-000000000115 32134 1727204430.86702: variable 'ansible_search_path' from source: unknown 32134 1727204430.86716: variable 'ansible_search_path' from source: unknown 32134 1727204430.86771: calling self._execute() 32134 1727204430.86879: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.86900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.86916: variable 'omit' from source: magic vars 32134 1727204430.87391: variable 'ansible_distribution_major_version' from source: facts 32134 1727204430.87417: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204430.87497: _execute() done 32134 1727204430.87501: dumping result to json 32134 1727204430.87506: done dumping result, returning 32134 1727204430.87511: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-753f-5162-000000000115] 32134 1727204430.87514: sending task result for task 12b410aa-8751-753f-5162-000000000115 32134 1727204430.87634: no more pending results, returning what we have 32134 1727204430.87641: in VariableManager get_vars() 32134 1727204430.87802: Calling all_inventory to load vars for managed-node2 32134 1727204430.87806: Calling groups_inventory to load vars for managed-node2 32134 1727204430.87809: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.87826: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.87830: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.87835: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.88375: done sending task result for task 12b410aa-8751-753f-5162-000000000115 32134 1727204430.88379: WORKER PROCESS EXITING 32134 1727204430.88412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 32134 1727204430.89114: done with get_vars() 32134 1727204430.89124: variable 'ansible_search_path' from source: unknown 32134 1727204430.89126: variable 'ansible_search_path' from source: unknown 32134 1727204430.89293: we have included files to process 32134 1727204430.89295: generating all_blocks data 32134 1727204430.89297: done generating all_blocks data 32134 1727204430.89299: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32134 1727204430.89301: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32134 1727204430.89303: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32134 1727204430.89999: done processing included file 32134 1727204430.90002: iterating over new_blocks loaded from include file 32134 1727204430.90004: in VariableManager get_vars() 32134 1727204430.90024: done with get_vars() 32134 1727204430.90026: filtering new block on tags 32134 1727204430.90054: done filtering new block on tags 32134 1727204430.90057: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 32134 1727204430.90063: extending task lists for all hosts with included blocks 32134 1727204430.90210: done extending task lists 32134 1727204430.90211: done processing included files 32134 1727204430.90212: results queue empty 32134 1727204430.90213: checking for any_errors_fatal 32134 1727204430.90217: done checking for any_errors_fatal 32134 1727204430.90218: checking for max_fail_percentage 32134 1727204430.90219: done checking for max_fail_percentage 32134 1727204430.90220: checking to see if all hosts have failed and the running result is not ok 32134 1727204430.90221: done checking to see if all hosts have failed 32134 1727204430.90222: getting the remaining hosts for this loop 32134 1727204430.90223: done getting the remaining hosts for this loop 32134 1727204430.90226: getting the next task for host managed-node2 32134 1727204430.90231: done getting next task for host managed-node2 32134 1727204430.90234: ^ task is: TASK: Gather current interface info 32134 1727204430.90237: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204430.90240: getting variables 32134 1727204430.90241: in VariableManager get_vars() 32134 1727204430.90259: Calling all_inventory to load vars for managed-node2 32134 1727204430.90262: Calling groups_inventory to load vars for managed-node2 32134 1727204430.90265: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204430.90271: Calling all_plugins_play to load vars for managed-node2 32134 1727204430.90274: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204430.90278: Calling groups_plugins_play to load vars for managed-node2 32134 1727204430.90521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204430.90868: done with get_vars() 32134 1727204430.90880: done getting variables 32134 1727204430.90947: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:30 -0400 (0:00:00.051) 0:00:05.313 ***** 32134 1727204430.90984: entering _queue_task() for managed-node2/command 32134 1727204430.91347: worker is 1 (out of 1 available) 32134 1727204430.91360: exiting _queue_task() for managed-node2/command 32134 1727204430.91373: done queuing things up, now waiting for results queue to drain 32134 1727204430.91375: waiting for pending results... 
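
The "Gather current interface info" task queued here (get_current_interfaces.yml:3) ends up invoking the command module with chdir=/sys/class/net and _raw_params="ls -1", as the module_args dump further down shows. A sketch of the corresponding task, with an assumed register name, would be:

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net          # matches the chdir seen in module_args below
  register: _current_interfaces    # register name assumed for illustration
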
32134 1727204430.91569: running TaskExecutor() for managed-node2/TASK: Gather current interface info 32134 1727204430.91677: in run() - task 12b410aa-8751-753f-5162-000000000192 32134 1727204430.91689: variable 'ansible_search_path' from source: unknown 32134 1727204430.91697: variable 'ansible_search_path' from source: unknown 32134 1727204430.91736: calling self._execute() 32134 1727204430.91824: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.91833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.91845: variable 'omit' from source: magic vars 32134 1727204430.92977: variable 'ansible_distribution_major_version' from source: facts 32134 1727204430.92988: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204430.92999: variable 'omit' from source: magic vars 32134 1727204430.93134: variable 'omit' from source: magic vars 32134 1727204430.93138: variable 'omit' from source: magic vars 32134 1727204430.93295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204430.93299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204430.93303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204430.93305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204430.93307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204430.93394: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204430.93398: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.93401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.93448: Set connection var ansible_timeout to 10 32134 1727204430.93467: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204430.93470: Set connection var ansible_connection to ssh 32134 1727204430.93472: Set connection var ansible_shell_type to sh 32134 1727204430.93481: Set connection var ansible_shell_executable to /bin/sh 32134 1727204430.93491: Set connection var ansible_pipelining to False 32134 1727204430.93519: variable 'ansible_shell_executable' from source: unknown 32134 1727204430.93523: variable 'ansible_connection' from source: unknown 32134 1727204430.93643: variable 'ansible_module_compression' from source: unknown 32134 1727204430.93646: variable 'ansible_shell_type' from source: unknown 32134 1727204430.93649: variable 'ansible_shell_executable' from source: unknown 32134 1727204430.93652: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204430.93654: variable 'ansible_pipelining' from source: unknown 32134 1727204430.93657: variable 'ansible_timeout' from source: unknown 32134 1727204430.93659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204430.93753: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204430.93757: variable 'omit' from source: magic vars 32134 
1727204430.93760: starting attempt loop 32134 1727204430.93763: running the handler 32134 1727204430.93765: _low_level_execute_command(): starting 32134 1727204430.93767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204430.94596: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204430.94600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204430.94603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204430.94607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204430.94609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204430.94612: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204430.94613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204430.94616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204430.94683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204430.94694: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204430.94711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204430.94732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204430.94816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204430.96592: stdout chunk (state=3): >>>/root <<< 32134 1727204430.96908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204430.96950: stderr chunk (state=3): >>><<< 32134 1727204430.97008: stdout chunk (state=3): >>><<< 32134 1727204430.97045: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 
1727204430.97069: _low_level_execute_command(): starting 32134 1727204430.97082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778 `" && echo ansible-tmp-1727204430.9705374-32389-66431293083778="` echo /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778 `" ) && sleep 0' 32134 1727204430.98508: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204430.98673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204430.98719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204430.98762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.00849: stdout chunk (state=3): >>>ansible-tmp-1727204430.9705374-32389-66431293083778=/root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778 <<< 32134 1727204431.00969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204431.01157: stderr chunk (state=3): >>><<< 32134 1727204431.01396: stdout chunk (state=3): >>><<< 32134 1727204431.01400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204430.9705374-32389-66431293083778=/root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204431.01403: variable 'ansible_module_compression' from source: unknown 32134 
1727204431.01439: ANSIBALLZ: Using generic lock for ansible.legacy.command 32134 1727204431.01695: ANSIBALLZ: Acquiring lock 32134 1727204431.01699: ANSIBALLZ: Lock acquired: 140589353832608 32134 1727204431.01701: ANSIBALLZ: Creating module 32134 1727204431.27713: ANSIBALLZ: Writing module into payload 32134 1727204431.28132: ANSIBALLZ: Writing module 32134 1727204431.28136: ANSIBALLZ: Renaming module 32134 1727204431.28138: ANSIBALLZ: Done creating module 32134 1727204431.28141: variable 'ansible_facts' from source: unknown 32134 1727204431.28246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py 32134 1727204431.28585: Sending initial data 32134 1727204431.28598: Sent initial data (155 bytes) 32134 1727204431.29219: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204431.29235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204431.29251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204431.29271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204431.29334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.29397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.29421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204431.29444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.29516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.31249: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204431.31316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204431.31358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpg7v8qzv_ /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py <<< 32134 1727204431.31374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py" <<< 32134 1727204431.31400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpg7v8qzv_" to remote "/root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py" <<< 32134 1727204431.32567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204431.32608: stderr chunk (state=3): >>><<< 32134 1727204431.32618: stdout chunk (state=3): >>><<< 32134 1727204431.32647: done transferring module to remote 32134 1727204431.32664: _low_level_execute_command(): starting 32134 1727204431.32675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/ /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py && sleep 0' 32134 1727204431.33328: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204431.33344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204431.33359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204431.33377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204431.33397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204431.33410: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204431.33515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.33547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.33569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204431.33587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.33665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.35795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204431.36010: stdout chunk (state=3): >>><<< 32134 1727204431.36013: stderr chunk (state=3): >>><<< 32134 1727204431.36016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204431.36019: _low_level_execute_command(): starting 32134 1727204431.36022: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/AnsiballZ_command.py && sleep 0' 32134 1727204431.37274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204431.37352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.37459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.37493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.37534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.55767: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:31.552970", "end": "2024-09-24 15:00:31.556766", "delta": "0:00:00.003796", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204431.57435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204431.57494: stderr chunk (state=3): >>><<< 32134 1727204431.57498: stdout chunk (state=3): >>><<< 32134 1727204431.57517: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:31.552970", "end": "2024-09-24 15:00:31.556766", "delta": "0:00:00.003796", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
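Based on the module invocation echoed in the result above (chdir "/sys/class/net", _raw_params "ls -1", _uses_shell false) and the '_current_interfaces' variable referenced a few entries below, the task at get_current_interfaces.yml:3 is plausibly a plain command task along these lines (a reconstruction from this log, not the verbatim source):

- name: Gather current interface info
  command: ls -1                  # free-form command, surfaces as _raw_params in the invocation above
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # variable name inferred from the set_fact task that follows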
32134 1727204431.57596: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204431.57599: _low_level_execute_command(): starting 32134 1727204431.57602: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204430.9705374-32389-66431293083778/ > /dev/null 2>&1 && sleep 0' 32134 1727204431.58047: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204431.58050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204431.58054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.58056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204431.58059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.58107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.58129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.58162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.60168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204431.60207: stderr chunk (state=3): >>><<< 32134 1727204431.60211: stdout chunk (state=3): >>><<< 32134 1727204431.60397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204431.60401: handler run complete 32134 1727204431.60404: Evaluated conditional (False): False 32134 1727204431.60406: attempt loop complete, returning result 32134 1727204431.60408: _execute() done 32134 1727204431.60410: dumping result to json 32134 1727204431.60415: done dumping result, returning 32134 1727204431.60417: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-753f-5162-000000000192] 32134 1727204431.60419: sending task result for task 12b410aa-8751-753f-5162-000000000192 32134 1727204431.60504: done sending task result for task 12b410aa-8751-753f-5162-000000000192 32134 1727204431.60508: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003796", "end": "2024-09-24 15:00:31.556766", "rc": 0, "start": "2024-09-24 15:00:31.552970" } STDOUT: bonding_masters eth0 lo 32134 1727204431.60777: no more pending results, returning what we have 32134 1727204431.60781: results queue empty 32134 1727204431.60783: checking for any_errors_fatal 32134 1727204431.60784: done checking for any_errors_fatal 32134 1727204431.60785: checking for max_fail_percentage 32134 1727204431.60787: done checking for max_fail_percentage 32134 1727204431.60788: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.60896: done checking to see if all hosts have failed 32134 1727204431.60898: getting the remaining hosts for this loop 32134 1727204431.60900: done getting the remaining hosts for this loop 32134 1727204431.60906: getting the next task for host managed-node2 32134 1727204431.60916: done getting next task for host managed-node2 32134 1727204431.60920: ^ task is: TASK: Set current_interfaces 32134 1727204431.60924: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204431.60928: getting variables 32134 1727204431.60930: in VariableManager get_vars() 32134 1727204431.61029: Calling all_inventory to load vars for managed-node2 32134 1727204431.61033: Calling groups_inventory to load vars for managed-node2 32134 1727204431.61036: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.61049: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.61052: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.61174: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.61564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.61938: done with get_vars() 32134 1727204431.61952: done getting variables 32134 1727204431.62038: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.710) 0:00:06.024 ***** 32134 1727204431.62079: entering _queue_task() for managed-node2/set_fact 32134 1727204431.62521: worker is 1 (out of 1 available) 32134 1727204431.62532: exiting _queue_task() for managed-node2/set_fact 32134 1727204431.62543: done queuing things up, now waiting for results queue to drain 32134 1727204431.62545: waiting for pending results... 
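Given the result that follows (ansible_facts.current_interfaces set to the lines produced by the ls above) and the '_current_interfaces' variable it reads, the set_fact task at get_current_interfaces.yml:9 is plausibly no more than the sketch below; using stdout_lines is an assumption that matches the resulting list:

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed source of the list shown in the result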
32134 1727204431.62784: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 32134 1727204431.63009: in run() - task 12b410aa-8751-753f-5162-000000000193 32134 1727204431.63021: variable 'ansible_search_path' from source: unknown 32134 1727204431.63025: variable 'ansible_search_path' from source: unknown 32134 1727204431.63027: calling self._execute() 32134 1727204431.63060: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.63068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.63078: variable 'omit' from source: magic vars 32134 1727204431.63382: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.63416: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.63420: variable 'omit' from source: magic vars 32134 1727204431.63458: variable 'omit' from source: magic vars 32134 1727204431.63554: variable '_current_interfaces' from source: set_fact 32134 1727204431.63605: variable 'omit' from source: magic vars 32134 1727204431.63642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204431.63673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204431.63692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204431.63709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204431.63721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204431.63750: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204431.63754: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.63758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.63844: Set connection var ansible_timeout to 10 32134 1727204431.63858: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204431.63861: Set connection var ansible_connection to ssh 32134 1727204431.63864: Set connection var ansible_shell_type to sh 32134 1727204431.63871: Set connection var ansible_shell_executable to /bin/sh 32134 1727204431.63881: Set connection var ansible_pipelining to False 32134 1727204431.63903: variable 'ansible_shell_executable' from source: unknown 32134 1727204431.63906: variable 'ansible_connection' from source: unknown 32134 1727204431.63909: variable 'ansible_module_compression' from source: unknown 32134 1727204431.63916: variable 'ansible_shell_type' from source: unknown 32134 1727204431.63918: variable 'ansible_shell_executable' from source: unknown 32134 1727204431.63920: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.63926: variable 'ansible_pipelining' from source: unknown 32134 1727204431.63928: variable 'ansible_timeout' from source: unknown 32134 1727204431.63934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.64053: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 32134 1727204431.64065: variable 'omit' from source: magic vars 32134 1727204431.64072: starting attempt loop 32134 1727204431.64075: running the handler 32134 1727204431.64086: handler run complete 32134 1727204431.64098: attempt loop complete, returning result 32134 1727204431.64101: _execute() done 32134 1727204431.64105: dumping result to json 32134 1727204431.64111: done dumping result, returning 32134 1727204431.64117: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-753f-5162-000000000193] 32134 1727204431.64123: sending task result for task 12b410aa-8751-753f-5162-000000000193 32134 1727204431.64223: done sending task result for task 12b410aa-8751-753f-5162-000000000193 32134 1727204431.64226: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 32134 1727204431.64302: no more pending results, returning what we have 32134 1727204431.64305: results queue empty 32134 1727204431.64306: checking for any_errors_fatal 32134 1727204431.64314: done checking for any_errors_fatal 32134 1727204431.64315: checking for max_fail_percentage 32134 1727204431.64316: done checking for max_fail_percentage 32134 1727204431.64317: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.64318: done checking to see if all hosts have failed 32134 1727204431.64319: getting the remaining hosts for this loop 32134 1727204431.64321: done getting the remaining hosts for this loop 32134 1727204431.64324: getting the next task for host managed-node2 32134 1727204431.64333: done getting next task for host managed-node2 32134 1727204431.64336: ^ task is: TASK: Show current_interfaces 32134 1727204431.64339: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204431.64342: getting variables 32134 1727204431.64344: in VariableManager get_vars() 32134 1727204431.64376: Calling all_inventory to load vars for managed-node2 32134 1727204431.64380: Calling groups_inventory to load vars for managed-node2 32134 1727204431.64382: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.64402: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.64406: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.64411: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.64650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.64896: done with get_vars() 32134 1727204431.64904: done getting variables 32134 1727204431.64983: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.029) 0:00:06.054 ***** 32134 1727204431.65008: entering _queue_task() for managed-node2/debug 32134 1727204431.65009: Creating lock for debug 32134 1727204431.65221: worker is 1 (out of 1 available) 32134 1727204431.65235: exiting _queue_task() for managed-node2/debug 32134 1727204431.65246: done queuing things up, now waiting for results queue to drain 32134 1727204431.65248: waiting for pending results... 
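The debug task queued here (show_interfaces.yml:5) presumably amounts to the sketch below, judging from the MSG it prints a few entries later; the exact msg template is an assumption:

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"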
32134 1727204431.65509: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 32134 1727204431.65543: in run() - task 12b410aa-8751-753f-5162-000000000116 32134 1727204431.65562: variable 'ansible_search_path' from source: unknown 32134 1727204431.65569: variable 'ansible_search_path' from source: unknown 32134 1727204431.65610: calling self._execute() 32134 1727204431.65711: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.65794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.65798: variable 'omit' from source: magic vars 32134 1727204431.66119: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.66127: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.66134: variable 'omit' from source: magic vars 32134 1727204431.66164: variable 'omit' from source: magic vars 32134 1727204431.66248: variable 'current_interfaces' from source: set_fact 32134 1727204431.66271: variable 'omit' from source: magic vars 32134 1727204431.66308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204431.66339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204431.66357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204431.66373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204431.66385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204431.66418: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204431.66422: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.66424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.66509: Set connection var ansible_timeout to 10 32134 1727204431.66520: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204431.66523: Set connection var ansible_connection to ssh 32134 1727204431.66527: Set connection var ansible_shell_type to sh 32134 1727204431.66534: Set connection var ansible_shell_executable to /bin/sh 32134 1727204431.66541: Set connection var ansible_pipelining to False 32134 1727204431.66559: variable 'ansible_shell_executable' from source: unknown 32134 1727204431.66563: variable 'ansible_connection' from source: unknown 32134 1727204431.66565: variable 'ansible_module_compression' from source: unknown 32134 1727204431.66570: variable 'ansible_shell_type' from source: unknown 32134 1727204431.66572: variable 'ansible_shell_executable' from source: unknown 32134 1727204431.66577: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.66582: variable 'ansible_pipelining' from source: unknown 32134 1727204431.66585: variable 'ansible_timeout' from source: unknown 32134 1727204431.66592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.66706: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
32134 1727204431.66717: variable 'omit' from source: magic vars 32134 1727204431.66724: starting attempt loop 32134 1727204431.66727: running the handler 32134 1727204431.66769: handler run complete 32134 1727204431.66782: attempt loop complete, returning result 32134 1727204431.66785: _execute() done 32134 1727204431.66788: dumping result to json 32134 1727204431.66795: done dumping result, returning 32134 1727204431.66803: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-753f-5162-000000000116] 32134 1727204431.66806: sending task result for task 12b410aa-8751-753f-5162-000000000116 32134 1727204431.66898: done sending task result for task 12b410aa-8751-753f-5162-000000000116 32134 1727204431.66901: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 32134 1727204431.66952: no more pending results, returning what we have 32134 1727204431.66955: results queue empty 32134 1727204431.66957: checking for any_errors_fatal 32134 1727204431.66962: done checking for any_errors_fatal 32134 1727204431.66963: checking for max_fail_percentage 32134 1727204431.66964: done checking for max_fail_percentage 32134 1727204431.66965: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.66966: done checking to see if all hosts have failed 32134 1727204431.66967: getting the remaining hosts for this loop 32134 1727204431.66969: done getting the remaining hosts for this loop 32134 1727204431.66973: getting the next task for host managed-node2 32134 1727204431.66980: done getting next task for host managed-node2 32134 1727204431.66983: ^ task is: TASK: Include the task 'manage_test_interface.yml' 32134 1727204431.66985: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204431.66990: getting variables 32134 1727204431.66992: in VariableManager get_vars() 32134 1727204431.67025: Calling all_inventory to load vars for managed-node2 32134 1727204431.67028: Calling groups_inventory to load vars for managed-node2 32134 1727204431.67031: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.67042: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.67044: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.67048: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.67207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.67382: done with get_vars() 32134 1727204431.67395: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:16 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.024) 0:00:06.078 ***** 32134 1727204431.67464: entering _queue_task() for managed-node2/include_tasks 32134 1727204431.67668: worker is 1 (out of 1 available) 32134 1727204431.67682: exiting _queue_task() for managed-node2/include_tasks 32134 1727204431.67695: done queuing things up, now waiting for results queue to drain 32134 1727204431.67698: waiting for pending results... 
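The include queued here (tests_ipv6_disabled.yml:16) is presumably a plain include_tasks of the file loaded a few entries later; note that 'state' is later reported as coming "from source: include params", so the real task almost certainly supplies variables whose values are not visible in this excerpt:

- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  # variables such as 'state' appear to be passed to this include,
  # but their values are not shown in this log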
32134 1727204431.67850: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 32134 1727204431.67916: in run() - task 12b410aa-8751-753f-5162-00000000000d 32134 1727204431.67927: variable 'ansible_search_path' from source: unknown 32134 1727204431.67960: calling self._execute() 32134 1727204431.68030: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.68037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.68048: variable 'omit' from source: magic vars 32134 1727204431.68353: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.68365: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.68378: _execute() done 32134 1727204431.68382: dumping result to json 32134 1727204431.68385: done dumping result, returning 32134 1727204431.68388: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [12b410aa-8751-753f-5162-00000000000d] 32134 1727204431.68393: sending task result for task 12b410aa-8751-753f-5162-00000000000d 32134 1727204431.68487: done sending task result for task 12b410aa-8751-753f-5162-00000000000d 32134 1727204431.68491: WORKER PROCESS EXITING 32134 1727204431.68519: no more pending results, returning what we have 32134 1727204431.68524: in VariableManager get_vars() 32134 1727204431.68562: Calling all_inventory to load vars for managed-node2 32134 1727204431.68565: Calling groups_inventory to load vars for managed-node2 32134 1727204431.68567: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.68578: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.68580: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.68584: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.68941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.69109: done with get_vars() 32134 1727204431.69115: variable 'ansible_search_path' from source: unknown 32134 1727204431.69125: we have included files to process 32134 1727204431.69126: generating all_blocks data 32134 1727204431.69127: done generating all_blocks data 32134 1727204431.69129: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32134 1727204431.69130: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32134 1727204431.69132: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32134 1727204431.69536: in VariableManager get_vars() 32134 1727204431.69552: done with get_vars() 32134 1727204431.69740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 32134 1727204431.70204: done processing included file 32134 1727204431.70206: iterating over new_blocks loaded from include file 32134 1727204431.70207: in VariableManager get_vars() 32134 1727204431.70219: done with get_vars() 32134 1727204431.70220: filtering new block on tags 32134 1727204431.70248: done filtering new block on tags 32134 1727204431.70250: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 32134 1727204431.70254: extending task lists for all hosts with included blocks 32134 1727204431.71130: done extending task lists 32134 1727204431.71132: done processing included files 32134 1727204431.71132: results queue empty 32134 1727204431.71133: checking for any_errors_fatal 32134 1727204431.71135: done checking for any_errors_fatal 32134 1727204431.71135: checking for max_fail_percentage 32134 1727204431.71136: done checking for max_fail_percentage 32134 1727204431.71137: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.71137: done checking to see if all hosts have failed 32134 1727204431.71138: getting the remaining hosts for this loop 32134 1727204431.71139: done getting the remaining hosts for this loop 32134 1727204431.71140: getting the next task for host managed-node2 32134 1727204431.71144: done getting next task for host managed-node2 32134 1727204431.71146: ^ task is: TASK: Ensure state in ["present", "absent"] 32134 1727204431.71147: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204431.71149: getting variables 32134 1727204431.71150: in VariableManager get_vars() 32134 1727204431.71158: Calling all_inventory to load vars for managed-node2 32134 1727204431.71159: Calling groups_inventory to load vars for managed-node2 32134 1727204431.71161: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.71165: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.71167: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.71169: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.71305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.71480: done with get_vars() 32134 1727204431.71487: done getting variables 32134 1727204431.71540: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.040) 0:00:06.119 ***** 32134 1727204431.71562: entering _queue_task() for managed-node2/fail 32134 1727204431.71564: Creating lock for fail 32134 1727204431.71773: worker is 1 (out of 1 available) 32134 1727204431.71786: exiting _queue_task() for managed-node2/fail 32134 1727204431.71805: done queuing things up, now waiting for results queue to drain 32134 1727204431.71807: waiting for pending results... 
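The guard task queued here (manage_test_interface.yml:3) is evidently a fail module gated on the expression reported as false_condition in the skip result below; the failure message itself is an assumption:

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent'"   # message assumed, not shown in the log
  when: state not in ["present", "absent"]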
32134 1727204431.71988: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 32134 1727204431.72064: in run() - task 12b410aa-8751-753f-5162-0000000001ae 32134 1727204431.72076: variable 'ansible_search_path' from source: unknown 32134 1727204431.72080: variable 'ansible_search_path' from source: unknown 32134 1727204431.72295: calling self._execute() 32134 1727204431.72303: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.72306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.72308: variable 'omit' from source: magic vars 32134 1727204431.72628: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.72646: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.72810: variable 'state' from source: include params 32134 1727204431.72823: Evaluated conditional (state not in ["present", "absent"]): False 32134 1727204431.72830: when evaluation is False, skipping this task 32134 1727204431.72837: _execute() done 32134 1727204431.72845: dumping result to json 32134 1727204431.72853: done dumping result, returning 32134 1727204431.72862: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-753f-5162-0000000001ae] 32134 1727204431.72870: sending task result for task 12b410aa-8751-753f-5162-0000000001ae skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 32134 1727204431.73021: no more pending results, returning what we have 32134 1727204431.73026: results queue empty 32134 1727204431.73028: checking for any_errors_fatal 32134 1727204431.73029: done checking for any_errors_fatal 32134 1727204431.73030: checking for max_fail_percentage 32134 1727204431.73031: done checking for max_fail_percentage 32134 1727204431.73032: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.73033: done checking to see if all hosts have failed 32134 1727204431.73034: getting the remaining hosts for this loop 32134 1727204431.73036: done getting the remaining hosts for this loop 32134 1727204431.73040: getting the next task for host managed-node2 32134 1727204431.73047: done getting next task for host managed-node2 32134 1727204431.73050: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 32134 1727204431.73053: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204431.73063: getting variables 32134 1727204431.73065: in VariableManager get_vars() 32134 1727204431.73105: Calling all_inventory to load vars for managed-node2 32134 1727204431.73109: Calling groups_inventory to load vars for managed-node2 32134 1727204431.73112: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.73126: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.73129: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.73134: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.73698: done sending task result for task 12b410aa-8751-753f-5162-0000000001ae 32134 1727204431.73702: WORKER PROCESS EXITING 32134 1727204431.73784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.74552: done with get_vars() 32134 1727204431.74565: done getting variables 32134 1727204431.74804: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.032) 0:00:06.152 ***** 32134 1727204431.74849: entering _queue_task() for managed-node2/fail 32134 1727204431.75458: worker is 1 (out of 1 available) 32134 1727204431.75473: exiting _queue_task() for managed-node2/fail 32134 1727204431.75486: done queuing things up, now waiting for results queue to drain 32134 1727204431.75488: waiting for pending results... 
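Likewise, the second guard (manage_test_interface.yml:8) checks the interface type against the same pattern; again only the when condition is visible in the log, and the message is assumed:

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be 'dummy', 'tap' or 'veth'"   # message assumed
  when: type not in ["dummy", "tap", "veth"]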
32134 1727204431.75767: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 32134 1727204431.75933: in run() - task 12b410aa-8751-753f-5162-0000000001af 32134 1727204431.75936: variable 'ansible_search_path' from source: unknown 32134 1727204431.75939: variable 'ansible_search_path' from source: unknown 32134 1727204431.75960: calling self._execute() 32134 1727204431.76063: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.76068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.76240: variable 'omit' from source: magic vars 32134 1727204431.76943: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.76947: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.77276: variable 'type' from source: set_fact 32134 1727204431.77304: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 32134 1727204431.77316: when evaluation is False, skipping this task 32134 1727204431.77596: _execute() done 32134 1727204431.77600: dumping result to json 32134 1727204431.77603: done dumping result, returning 32134 1727204431.77605: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-753f-5162-0000000001af] 32134 1727204431.77608: sending task result for task 12b410aa-8751-753f-5162-0000000001af 32134 1727204431.77687: done sending task result for task 12b410aa-8751-753f-5162-0000000001af 32134 1727204431.77694: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 32134 1727204431.77748: no more pending results, returning what we have 32134 1727204431.77753: results queue empty 32134 1727204431.77754: checking for any_errors_fatal 32134 1727204431.77761: done checking for any_errors_fatal 32134 1727204431.77762: checking for max_fail_percentage 32134 1727204431.77764: done checking for max_fail_percentage 32134 1727204431.77765: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.77766: done checking to see if all hosts have failed 32134 1727204431.77766: getting the remaining hosts for this loop 32134 1727204431.77768: done getting the remaining hosts for this loop 32134 1727204431.77773: getting the next task for host managed-node2 32134 1727204431.77781: done getting next task for host managed-node2 32134 1727204431.77783: ^ task is: TASK: Include the task 'show_interfaces.yml' 32134 1727204431.77787: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204431.77794: getting variables 32134 1727204431.77795: in VariableManager get_vars() 32134 1727204431.77834: Calling all_inventory to load vars for managed-node2 32134 1727204431.77838: Calling groups_inventory to load vars for managed-node2 32134 1727204431.77840: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.77854: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.77857: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.77860: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.78323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.78815: done with get_vars() 32134 1727204431.78829: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.040) 0:00:06.193 ***** 32134 1727204431.78944: entering _queue_task() for managed-node2/include_tasks 32134 1727204431.79219: worker is 1 (out of 1 available) 32134 1727204431.79234: exiting _queue_task() for managed-node2/include_tasks 32134 1727204431.79247: done queuing things up, now waiting for results queue to drain 32134 1727204431.79249: waiting for pending results... 32134 1727204431.79537: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 32134 1727204431.79668: in run() - task 12b410aa-8751-753f-5162-0000000001b0 32134 1727204431.79696: variable 'ansible_search_path' from source: unknown 32134 1727204431.79708: variable 'ansible_search_path' from source: unknown 32134 1727204431.79897: calling self._execute() 32134 1727204431.79901: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.79904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.79907: variable 'omit' from source: magic vars 32134 1727204431.80362: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.80383: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.80400: _execute() done 32134 1727204431.80409: dumping result to json 32134 1727204431.80422: done dumping result, returning 32134 1727204431.80433: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-753f-5162-0000000001b0] 32134 1727204431.80443: sending task result for task 12b410aa-8751-753f-5162-0000000001b0 32134 1727204431.80596: no more pending results, returning what we have 32134 1727204431.80603: in VariableManager get_vars() 32134 1727204431.80651: Calling all_inventory to load vars for managed-node2 32134 1727204431.80655: Calling groups_inventory to load vars for managed-node2 32134 1727204431.80659: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.80676: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.80680: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.80685: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.81295: done sending task result for task 12b410aa-8751-753f-5162-0000000001b0 32134 1727204431.81299: WORKER PROCESS EXITING 32134 1727204431.81331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 32134 1727204431.81657: done with get_vars() 32134 1727204431.81667: variable 'ansible_search_path' from source: unknown 32134 1727204431.81668: variable 'ansible_search_path' from source: unknown 32134 1727204431.81715: we have included files to process 32134 1727204431.81717: generating all_blocks data 32134 1727204431.81718: done generating all_blocks data 32134 1727204431.81722: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32134 1727204431.81724: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32134 1727204431.81727: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32134 1727204431.81851: in VariableManager get_vars() 32134 1727204431.81879: done with get_vars() 32134 1727204431.82018: done processing included file 32134 1727204431.82021: iterating over new_blocks loaded from include file 32134 1727204431.82022: in VariableManager get_vars() 32134 1727204431.82042: done with get_vars() 32134 1727204431.82044: filtering new block on tags 32134 1727204431.82067: done filtering new block on tags 32134 1727204431.82070: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 32134 1727204431.82076: extending task lists for all hosts with included blocks 32134 1727204431.82629: done extending task lists 32134 1727204431.82630: done processing included files 32134 1727204431.82631: results queue empty 32134 1727204431.82632: checking for any_errors_fatal 32134 1727204431.82636: done checking for any_errors_fatal 32134 1727204431.82637: checking for max_fail_percentage 32134 1727204431.82638: done checking for max_fail_percentage 32134 1727204431.82639: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.82640: done checking to see if all hosts have failed 32134 1727204431.82641: getting the remaining hosts for this loop 32134 1727204431.82643: done getting the remaining hosts for this loop 32134 1727204431.82646: getting the next task for host managed-node2 32134 1727204431.82651: done getting next task for host managed-node2 32134 1727204431.82653: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32134 1727204431.82656: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204431.82659: getting variables 32134 1727204431.82660: in VariableManager get_vars() 32134 1727204431.82672: Calling all_inventory to load vars for managed-node2 32134 1727204431.82675: Calling groups_inventory to load vars for managed-node2 32134 1727204431.82677: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.82683: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.82686: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.82692: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.82946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.83272: done with get_vars() 32134 1727204431.83283: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.044) 0:00:06.237 ***** 32134 1727204431.83373: entering _queue_task() for managed-node2/include_tasks 32134 1727204431.83682: worker is 1 (out of 1 available) 32134 1727204431.83897: exiting _queue_task() for managed-node2/include_tasks 32134 1727204431.83908: done queuing things up, now waiting for results queue to drain 32134 1727204431.83909: waiting for pending results... 32134 1727204431.84022: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 32134 1727204431.84169: in run() - task 12b410aa-8751-753f-5162-000000000245 32134 1727204431.84194: variable 'ansible_search_path' from source: unknown 32134 1727204431.84204: variable 'ansible_search_path' from source: unknown 32134 1727204431.84256: calling self._execute() 32134 1727204431.84358: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.84373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.84388: variable 'omit' from source: magic vars 32134 1727204431.84827: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.84846: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.84858: _execute() done 32134 1727204431.84866: dumping result to json 32134 1727204431.84876: done dumping result, returning 32134 1727204431.84888: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-753f-5162-000000000245] 32134 1727204431.84902: sending task result for task 12b410aa-8751-753f-5162-000000000245 32134 1727204431.85036: no more pending results, returning what we have 32134 1727204431.85043: in VariableManager get_vars() 32134 1727204431.85088: Calling all_inventory to load vars for managed-node2 32134 1727204431.85094: Calling groups_inventory to load vars for managed-node2 32134 1727204431.85097: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.85119: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.85123: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.85127: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.85601: done sending task result for task 12b410aa-8751-753f-5162-000000000245 32134 1727204431.85605: WORKER PROCESS EXITING 32134 1727204431.85637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 32134 1727204431.86004: done with get_vars() 32134 1727204431.86018: variable 'ansible_search_path' from source: unknown 32134 1727204431.86019: variable 'ansible_search_path' from source: unknown 32134 1727204431.86093: we have included files to process 32134 1727204431.86094: generating all_blocks data 32134 1727204431.86096: done generating all_blocks data 32134 1727204431.86098: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32134 1727204431.86099: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32134 1727204431.86102: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32134 1727204431.86440: done processing included file 32134 1727204431.86443: iterating over new_blocks loaded from include file 32134 1727204431.86445: in VariableManager get_vars() 32134 1727204431.86467: done with get_vars() 32134 1727204431.86469: filtering new block on tags 32134 1727204431.86496: done filtering new block on tags 32134 1727204431.86499: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 32134 1727204431.86505: extending task lists for all hosts with included blocks 32134 1727204431.86720: done extending task lists 32134 1727204431.86722: done processing included files 32134 1727204431.86723: results queue empty 32134 1727204431.86724: checking for any_errors_fatal 32134 1727204431.86728: done checking for any_errors_fatal 32134 1727204431.86729: checking for max_fail_percentage 32134 1727204431.86730: done checking for max_fail_percentage 32134 1727204431.86731: checking to see if all hosts have failed and the running result is not ok 32134 1727204431.86732: done checking to see if all hosts have failed 32134 1727204431.86733: getting the remaining hosts for this loop 32134 1727204431.86735: done getting the remaining hosts for this loop 32134 1727204431.86738: getting the next task for host managed-node2 32134 1727204431.86745: done getting next task for host managed-node2 32134 1727204431.86747: ^ task is: TASK: Gather current interface info 32134 1727204431.86751: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32134 1727204431.86754: getting variables 32134 1727204431.86755: in VariableManager get_vars() 32134 1727204431.86768: Calling all_inventory to load vars for managed-node2 32134 1727204431.86771: Calling groups_inventory to load vars for managed-node2 32134 1727204431.86774: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204431.86780: Calling all_plugins_play to load vars for managed-node2 32134 1727204431.86783: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204431.86786: Calling groups_plugins_play to load vars for managed-node2 32134 1727204431.87042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204431.87358: done with get_vars() 32134 1727204431.87369: done getting variables 32134 1727204431.87425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:31 -0400 (0:00:00.040) 0:00:06.278 ***** 32134 1727204431.87462: entering _queue_task() for managed-node2/command 32134 1727204431.87765: worker is 1 (out of 1 available) 32134 1727204431.87778: exiting _queue_task() for managed-node2/command 32134 1727204431.87794: done queuing things up, now waiting for results queue to drain 32134 1727204431.87796: waiting for pending results... 
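The task the worker is about to run is the command task from get_current_interfaces.yml; the module_args recorded further down in this log (chdir=/sys/class/net, "ls -1") show that it simply enumerates the kernel's network interfaces. For reference, a minimal local equivalent in Python, assuming only that /sys/class/net is the sysfs directory the task lists (the function name is illustrative, not the role's code):

```python
# Rough equivalent of the "Gather current interface info" step outside Ansible:
# per the module_args logged below, the command module runs `ls -1` with
# chdir=/sys/class/net, i.e. it just lists the kernel's network interfaces.
import os

def current_interfaces(sysfs_net="/sys/class/net"):
    """Return the interface names the task would report, sorted like `ls -1`."""
    return sorted(os.listdir(sysfs_net))

if __name__ == "__main__":
    print("\n".join(current_interfaces()))  # e.g. bonding_masters, eth0, lo
```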
32134 1727204431.88214: running TaskExecutor() for managed-node2/TASK: Gather current interface info 32134 1727204431.88222: in run() - task 12b410aa-8751-753f-5162-00000000027c 32134 1727204431.88242: variable 'ansible_search_path' from source: unknown 32134 1727204431.88251: variable 'ansible_search_path' from source: unknown 32134 1727204431.88294: calling self._execute() 32134 1727204431.88397: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.88414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.88433: variable 'omit' from source: magic vars 32134 1727204431.88871: variable 'ansible_distribution_major_version' from source: facts 32134 1727204431.88891: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204431.88903: variable 'omit' from source: magic vars 32134 1727204431.88985: variable 'omit' from source: magic vars 32134 1727204431.89037: variable 'omit' from source: magic vars 32134 1727204431.89092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204431.89182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204431.89185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204431.89198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204431.89219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204431.89260: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204431.89270: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.89278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.89416: Set connection var ansible_timeout to 10 32134 1727204431.89507: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204431.89510: Set connection var ansible_connection to ssh 32134 1727204431.89515: Set connection var ansible_shell_type to sh 32134 1727204431.89518: Set connection var ansible_shell_executable to /bin/sh 32134 1727204431.89520: Set connection var ansible_pipelining to False 32134 1727204431.89522: variable 'ansible_shell_executable' from source: unknown 32134 1727204431.89524: variable 'ansible_connection' from source: unknown 32134 1727204431.89527: variable 'ansible_module_compression' from source: unknown 32134 1727204431.89529: variable 'ansible_shell_type' from source: unknown 32134 1727204431.89531: variable 'ansible_shell_executable' from source: unknown 32134 1727204431.89535: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204431.89546: variable 'ansible_pipelining' from source: unknown 32134 1727204431.89554: variable 'ansible_timeout' from source: unknown 32134 1727204431.89563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204431.89740: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204431.89758: variable 'omit' from source: magic vars 32134 
1727204431.89769: starting attempt loop 32134 1727204431.89776: running the handler 32134 1727204431.89835: _low_level_execute_command(): starting 32134 1727204431.89839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204431.90556: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204431.90608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.90684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.90699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204431.90744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.90784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.92588: stdout chunk (state=3): >>>/root <<< 32134 1727204431.92876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204431.92879: stdout chunk (state=3): >>><<< 32134 1727204431.92882: stderr chunk (state=3): >>><<< 32134 1727204431.92906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204431.92929: _low_level_execute_command(): starting 32134 1727204431.92957: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096 `" && echo ansible-tmp-1727204431.9291356-32501-269403155177096="` echo 
/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096 `" ) && sleep 0' 32134 1727204431.93809: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.93894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.93920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204431.93947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.94025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.96118: stdout chunk (state=3): >>>ansible-tmp-1727204431.9291356-32501-269403155177096=/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096 <<< 32134 1727204431.96338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204431.96341: stdout chunk (state=3): >>><<< 32134 1727204431.96345: stderr chunk (state=3): >>><<< 32134 1727204431.96366: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204431.9291356-32501-269403155177096=/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204431.96426: variable 'ansible_module_compression' from source: unknown 32134 1727204431.96496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204431.96557: variable 'ansible_facts' from source: unknown 32134 1727204431.96696: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py 32134 1727204431.96835: Sending initial data 32134 1727204431.96850: Sent initial data (156 bytes) 32134 1727204431.97688: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204431.97782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204431.97814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204431.98041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204431.98116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204431.99801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204431.99918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204431.99922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp5qxulc66 /root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py <<< 32134 1727204431.99926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py" <<< 32134 1727204431.99929: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp5qxulc66" to remote "/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py" <<< 32134 1727204432.01491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.01502: stderr chunk (state=3): >>><<< 32134 1727204432.01507: stdout chunk (state=3): >>><<< 32134 1727204432.01799: done transferring module to remote 32134 1727204432.01803: _low_level_execute_command(): starting 32134 1727204432.01807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/ /root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py && sleep 0' 32134 1727204432.03060: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.03102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204432.03136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.03201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.03246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.05417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.05463: stdout chunk (state=3): >>><<< 32134 1727204432.05469: stderr chunk (state=3): >>><<< 32134 1727204432.05827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204432.05835: _low_level_execute_command(): starting 32134 1727204432.05838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/AnsiballZ_command.py && sleep 0' 32134 1727204432.06674: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204432.06681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.06705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204432.06712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204432.06838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.06950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204432.06965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.07169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.07234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.25133: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:32.246684", "end": "2024-09-24 15:00:32.250400", "delta": "0:00:00.003716", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204432.26848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204432.26908: stderr chunk (state=3): >>><<< 32134 1727204432.26912: stdout chunk (state=3): >>><<< 32134 1727204432.26932: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:32.246684", "end": "2024-09-24 15:00:32.250400", "delta": "0:00:00.003716", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
32134 1727204432.26975: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204432.26984: _low_level_execute_command(): starting 32134 1727204432.26992: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204431.9291356-32501-269403155177096/ > /dev/null 2>&1 && sleep 0' 32134 1727204432.27455: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204432.27491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204432.27496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204432.27498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.27502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204432.27505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.27558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204432.27561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.27611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.29607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.29664: stderr chunk (state=3): >>><<< 32134 1727204432.29669: stdout chunk (state=3): >>><<< 32134 1727204432.29680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204432.29687: handler run complete 32134 1727204432.29716: Evaluated conditional (False): False 32134 1727204432.29725: attempt loop complete, returning result 32134 1727204432.29728: _execute() done 32134 1727204432.29733: dumping result to json 32134 1727204432.29739: done dumping result, returning 32134 1727204432.29747: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-753f-5162-00000000027c] 32134 1727204432.29753: sending task result for task 12b410aa-8751-753f-5162-00000000027c 32134 1727204432.29866: done sending task result for task 12b410aa-8751-753f-5162-00000000027c 32134 1727204432.29869: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003716", "end": "2024-09-24 15:00:32.250400", "rc": 0, "start": "2024-09-24 15:00:32.246684" } STDOUT: bonding_masters eth0 lo 32134 1727204432.29967: no more pending results, returning what we have 32134 1727204432.29970: results queue empty 32134 1727204432.29972: checking for any_errors_fatal 32134 1727204432.29973: done checking for any_errors_fatal 32134 1727204432.29974: checking for max_fail_percentage 32134 1727204432.29975: done checking for max_fail_percentage 32134 1727204432.29976: checking to see if all hosts have failed and the running result is not ok 32134 1727204432.29977: done checking to see if all hosts have failed 32134 1727204432.29978: getting the remaining hosts for this loop 32134 1727204432.29980: done getting the remaining hosts for this loop 32134 1727204432.29984: getting the next task for host managed-node2 32134 1727204432.29993: done getting next task for host managed-node2 32134 1727204432.29996: ^ task is: TASK: Set current_interfaces 32134 1727204432.30001: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204432.30005: getting variables 32134 1727204432.30006: in VariableManager get_vars() 32134 1727204432.30044: Calling all_inventory to load vars for managed-node2 32134 1727204432.30047: Calling groups_inventory to load vars for managed-node2 32134 1727204432.30050: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204432.30061: Calling all_plugins_play to load vars for managed-node2 32134 1727204432.30064: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204432.30067: Calling groups_plugins_play to load vars for managed-node2 32134 1727204432.30247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204432.30459: done with get_vars() 32134 1727204432.30469: done getting variables 32134 1727204432.30520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:32 -0400 (0:00:00.430) 0:00:06.709 ***** 32134 1727204432.30548: entering _queue_task() for managed-node2/set_fact 32134 1727204432.30819: worker is 1 (out of 1 available) 32134 1727204432.30832: exiting _queue_task() for managed-node2/set_fact 32134 1727204432.30845: done queuing things up, now waiting for results queue to drain 32134 1727204432.30846: waiting for pending results... 
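The "Set current_interfaces" task that runs next turns the registered command output (the '_current_interfaces' fact referenced below) into a list of interface names. The exact Jinja expression in the test tasks is not shown in this log excerpt, but its effect is equivalent to splitting the command's stdout into lines, e.g.:

```python
# Hedged sketch of what the "Set current_interfaces" set_fact boils down to:
# the registered stdout from the command task becomes a list of names.
_current_interfaces = {"stdout": "bonding_masters\neth0\nlo"}  # registered result from the log

current_interfaces = _current_interfaces["stdout"].splitlines()
assert current_interfaces == ["bonding_masters", "eth0", "lo"]
```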
32134 1727204432.31307: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 32134 1727204432.31311: in run() - task 12b410aa-8751-753f-5162-00000000027d 32134 1727204432.31317: variable 'ansible_search_path' from source: unknown 32134 1727204432.31320: variable 'ansible_search_path' from source: unknown 32134 1727204432.31362: calling self._execute() 32134 1727204432.31465: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.31478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.31496: variable 'omit' from source: magic vars 32134 1727204432.31958: variable 'ansible_distribution_major_version' from source: facts 32134 1727204432.31978: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204432.31992: variable 'omit' from source: magic vars 32134 1727204432.32042: variable 'omit' from source: magic vars 32134 1727204432.32145: variable '_current_interfaces' from source: set_fact 32134 1727204432.32201: variable 'omit' from source: magic vars 32134 1727204432.32239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204432.32268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204432.32285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204432.32308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204432.32320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204432.32349: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204432.32352: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.32356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.32444: Set connection var ansible_timeout to 10 32134 1727204432.32456: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204432.32459: Set connection var ansible_connection to ssh 32134 1727204432.32462: Set connection var ansible_shell_type to sh 32134 1727204432.32469: Set connection var ansible_shell_executable to /bin/sh 32134 1727204432.32475: Set connection var ansible_pipelining to False 32134 1727204432.32495: variable 'ansible_shell_executable' from source: unknown 32134 1727204432.32498: variable 'ansible_connection' from source: unknown 32134 1727204432.32501: variable 'ansible_module_compression' from source: unknown 32134 1727204432.32505: variable 'ansible_shell_type' from source: unknown 32134 1727204432.32508: variable 'ansible_shell_executable' from source: unknown 32134 1727204432.32513: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.32524: variable 'ansible_pipelining' from source: unknown 32134 1727204432.32526: variable 'ansible_timeout' from source: unknown 32134 1727204432.32530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.32651: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 32134 1727204432.32661: variable 'omit' from source: magic vars 32134 1727204432.32667: starting attempt loop 32134 1727204432.32671: running the handler 32134 1727204432.32683: handler run complete 32134 1727204432.32692: attempt loop complete, returning result 32134 1727204432.32699: _execute() done 32134 1727204432.32703: dumping result to json 32134 1727204432.32709: done dumping result, returning 32134 1727204432.32718: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-753f-5162-00000000027d] 32134 1727204432.32723: sending task result for task 12b410aa-8751-753f-5162-00000000027d 32134 1727204432.32813: done sending task result for task 12b410aa-8751-753f-5162-00000000027d 32134 1727204432.32816: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 32134 1727204432.32884: no more pending results, returning what we have 32134 1727204432.32887: results queue empty 32134 1727204432.32890: checking for any_errors_fatal 32134 1727204432.32899: done checking for any_errors_fatal 32134 1727204432.32900: checking for max_fail_percentage 32134 1727204432.32901: done checking for max_fail_percentage 32134 1727204432.32902: checking to see if all hosts have failed and the running result is not ok 32134 1727204432.32903: done checking to see if all hosts have failed 32134 1727204432.32904: getting the remaining hosts for this loop 32134 1727204432.32908: done getting the remaining hosts for this loop 32134 1727204432.32913: getting the next task for host managed-node2 32134 1727204432.32922: done getting next task for host managed-node2 32134 1727204432.32926: ^ task is: TASK: Show current_interfaces 32134 1727204432.32930: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204432.32934: getting variables 32134 1727204432.32935: in VariableManager get_vars() 32134 1727204432.32968: Calling all_inventory to load vars for managed-node2 32134 1727204432.32970: Calling groups_inventory to load vars for managed-node2 32134 1727204432.32973: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204432.32983: Calling all_plugins_play to load vars for managed-node2 32134 1727204432.32986: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204432.33012: Calling groups_plugins_play to load vars for managed-node2 32134 1727204432.33169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204432.33356: done with get_vars() 32134 1727204432.33371: done getting variables 32134 1727204432.33419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:32 -0400 (0:00:00.028) 0:00:06.738 ***** 32134 1727204432.33446: entering _queue_task() for managed-node2/debug 32134 1727204432.33652: worker is 1 (out of 1 available) 32134 1727204432.33666: exiting _queue_task() for managed-node2/debug 32134 1727204432.33678: done queuing things up, now waiting for results queue to drain 32134 1727204432.33680: waiting for pending results... 32134 1727204432.34002: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 32134 1727204432.34298: in run() - task 12b410aa-8751-753f-5162-000000000246 32134 1727204432.34303: variable 'ansible_search_path' from source: unknown 32134 1727204432.34306: variable 'ansible_search_path' from source: unknown 32134 1727204432.34310: calling self._execute() 32134 1727204432.34698: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.34702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.34705: variable 'omit' from source: magic vars 32134 1727204432.35550: variable 'ansible_distribution_major_version' from source: facts 32134 1727204432.35570: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204432.35584: variable 'omit' from source: magic vars 32134 1727204432.35655: variable 'omit' from source: magic vars 32134 1727204432.35788: variable 'current_interfaces' from source: set_fact 32134 1727204432.35848: variable 'omit' from source: magic vars 32134 1727204432.35900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204432.35953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204432.35984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204432.36014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204432.36063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204432.36085: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204432.36099: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.36110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.36245: Set connection var ansible_timeout to 10 32134 1727204432.36281: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204432.36347: Set connection var ansible_connection to ssh 32134 1727204432.36355: Set connection var ansible_shell_type to sh 32134 1727204432.36358: Set connection var ansible_shell_executable to /bin/sh 32134 1727204432.36360: Set connection var ansible_pipelining to False 32134 1727204432.36362: variable 'ansible_shell_executable' from source: unknown 32134 1727204432.36364: variable 'ansible_connection' from source: unknown 32134 1727204432.36366: variable 'ansible_module_compression' from source: unknown 32134 1727204432.36368: variable 'ansible_shell_type' from source: unknown 32134 1727204432.36377: variable 'ansible_shell_executable' from source: unknown 32134 1727204432.36391: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.36403: variable 'ansible_pipelining' from source: unknown 32134 1727204432.36411: variable 'ansible_timeout' from source: unknown 32134 1727204432.36420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.36716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204432.36742: variable 'omit' from source: magic vars 32134 1727204432.36754: starting attempt loop 32134 1727204432.36761: running the handler 32134 1727204432.36826: handler run complete 32134 1727204432.36851: attempt loop complete, returning result 32134 1727204432.36896: _execute() done 32134 1727204432.36900: dumping result to json 32134 1727204432.36902: done dumping result, returning 32134 1727204432.36905: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-753f-5162-000000000246] 32134 1727204432.36907: sending task result for task 12b410aa-8751-753f-5162-000000000246 ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 32134 1727204432.37173: no more pending results, returning what we have 32134 1727204432.37177: results queue empty 32134 1727204432.37178: checking for any_errors_fatal 32134 1727204432.37183: done checking for any_errors_fatal 32134 1727204432.37184: checking for max_fail_percentage 32134 1727204432.37186: done checking for max_fail_percentage 32134 1727204432.37186: checking to see if all hosts have failed and the running result is not ok 32134 1727204432.37188: done checking to see if all hosts have failed 32134 1727204432.37195: getting the remaining hosts for this loop 32134 1727204432.37197: done getting the remaining hosts for this loop 32134 1727204432.37203: getting the next task for host managed-node2 32134 1727204432.37212: done getting next task for host managed-node2 32134 1727204432.37216: ^ task is: TASK: Install iproute 32134 1727204432.37220: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204432.37224: getting variables 32134 1727204432.37226: in VariableManager get_vars() 32134 1727204432.37267: Calling all_inventory to load vars for managed-node2 32134 1727204432.37270: Calling groups_inventory to load vars for managed-node2 32134 1727204432.37274: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204432.37286: Calling all_plugins_play to load vars for managed-node2 32134 1727204432.37416: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204432.37423: Calling groups_plugins_play to load vars for managed-node2 32134 1727204432.37790: done sending task result for task 12b410aa-8751-753f-5162-000000000246 32134 1727204432.37794: WORKER PROCESS EXITING 32134 1727204432.37822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204432.38461: done with get_vars() 32134 1727204432.38474: done getting variables 32134 1727204432.38566: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:00:32 -0400 (0:00:00.051) 0:00:06.789 ***** 32134 1727204432.38603: entering _queue_task() for managed-node2/package 32134 1727204432.38982: worker is 1 (out of 1 available) 32134 1727204432.38997: exiting _queue_task() for managed-node2/package 32134 1727204432.39010: done queuing things up, now waiting for results queue to drain 32134 1727204432.39011: waiting for pending results... 
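The next task resolves the generic package action to the dnf module (the ANSIBALLZ lines below build ansible.legacy.dnf) and ensures iproute is present on the managed node. A hedged, stand-alone equivalent of what that ends up doing on the host, not Ansible's own implementation (package and command names taken from the log, helper name illustrative):

```python
# What the "Install iproute" task accomplishes on the managed node once the
# package action resolves to dnf: install the package if missing and report
# whether anything changed. Sketch only, not the dnf module's code.
import subprocess

def ensure_installed(name: str = "iproute") -> bool:
    """Return True if the package had to be installed (i.e. the task 'changed')."""
    already = subprocess.run(["rpm", "-q", name], capture_output=True).returncode == 0
    if not already:
        subprocess.run(["dnf", "-y", "install", name], check=True)
    return not already
```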
32134 1727204432.39367: running TaskExecutor() for managed-node2/TASK: Install iproute 32134 1727204432.39493: in run() - task 12b410aa-8751-753f-5162-0000000001b1 32134 1727204432.39513: variable 'ansible_search_path' from source: unknown 32134 1727204432.39522: variable 'ansible_search_path' from source: unknown 32134 1727204432.39563: calling self._execute() 32134 1727204432.39662: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.39678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.39705: variable 'omit' from source: magic vars 32134 1727204432.40132: variable 'ansible_distribution_major_version' from source: facts 32134 1727204432.40157: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204432.40195: variable 'omit' from source: magic vars 32134 1727204432.40225: variable 'omit' from source: magic vars 32134 1727204432.40483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204432.43071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204432.43193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204432.43234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204432.43298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204432.43322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204432.43446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204432.43520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204432.43533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204432.43691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204432.43697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204432.43748: variable '__network_is_ostree' from source: set_fact 32134 1727204432.43759: variable 'omit' from source: magic vars 32134 1727204432.43793: variable 'omit' from source: magic vars 32134 1727204432.43834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204432.43875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204432.43904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204432.43938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 32134 1727204432.43954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204432.44031: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204432.44034: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.44036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.44127: Set connection var ansible_timeout to 10 32134 1727204432.44154: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204432.44161: Set connection var ansible_connection to ssh 32134 1727204432.44167: Set connection var ansible_shell_type to sh 32134 1727204432.44176: Set connection var ansible_shell_executable to /bin/sh 32134 1727204432.44187: Set connection var ansible_pipelining to False 32134 1727204432.44297: variable 'ansible_shell_executable' from source: unknown 32134 1727204432.44300: variable 'ansible_connection' from source: unknown 32134 1727204432.44303: variable 'ansible_module_compression' from source: unknown 32134 1727204432.44305: variable 'ansible_shell_type' from source: unknown 32134 1727204432.44307: variable 'ansible_shell_executable' from source: unknown 32134 1727204432.44394: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204432.44397: variable 'ansible_pipelining' from source: unknown 32134 1727204432.44399: variable 'ansible_timeout' from source: unknown 32134 1727204432.44403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204432.44700: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204432.44704: variable 'omit' from source: magic vars 32134 1727204432.44707: starting attempt loop 32134 1727204432.44709: running the handler 32134 1727204432.44711: variable 'ansible_facts' from source: unknown 32134 1727204432.44716: variable 'ansible_facts' from source: unknown 32134 1727204432.44718: _low_level_execute_command(): starting 32134 1727204432.44721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204432.45357: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204432.45379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204432.45492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.45522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.45608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.47369: stdout chunk (state=3): >>>/root <<< 32134 1727204432.47514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.47571: stderr chunk (state=3): >>><<< 32134 1727204432.47591: stdout chunk (state=3): >>><<< 32134 1727204432.47633: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204432.47741: _low_level_execute_command(): starting 32134 1727204432.47746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540 `" && echo ansible-tmp-1727204432.4764197-32627-30209274948540="` echo /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540 `" ) && sleep 0' 32134 1727204432.48307: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204432.48325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204432.48404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.48465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204432.48488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.48508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
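Every remote command in this run reuses a single OpenSSH control master; that is what the recurring "auto-mux: Trying existing master" and "mux_client_request_session: master session id: 2" lines mean, and it is why each _low_level_execute_command() completes without a fresh TCP and authentication handshake. Assuming the controller's ssh config supplies the ControlPath (as it appears to here), the master's health can be checked from the controller like this; the host value is illustrative:

```python
# Quick check that the multiplexed master connection the log keeps reusing is
# still alive: `ssh -O check` asks the control master for its status (rc 0 = up).
import subprocess

def mux_master_alive(host: str = "10.31.9.159") -> bool:
    return subprocess.run(["ssh", "-O", "check", host], capture_output=True).returncode == 0
```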
32134 1727204432.48588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.50716: stdout chunk (state=3): >>>ansible-tmp-1727204432.4764197-32627-30209274948540=/root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540 <<< 32134 1727204432.50838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.50841: stdout chunk (state=3): >>><<< 32134 1727204432.50843: stderr chunk (state=3): >>><<< 32134 1727204432.51068: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204432.4764197-32627-30209274948540=/root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204432.51071: variable 'ansible_module_compression' from source: unknown 32134 1727204432.51103: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 32134 1727204432.51111: ANSIBALLZ: Acquiring lock 32134 1727204432.51121: ANSIBALLZ: Lock acquired: 140589353832608 32134 1727204432.51128: ANSIBALLZ: Creating module 32134 1727204432.75560: ANSIBALLZ: Writing module into payload 32134 1727204432.75923: ANSIBALLZ: Writing module 32134 1727204432.75955: ANSIBALLZ: Renaming module 32134 1727204432.75968: ANSIBALLZ: Done creating module 32134 1727204432.76003: variable 'ansible_facts' from source: unknown 32134 1727204432.76164: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py 32134 1727204432.76429: Sending initial data 32134 1727204432.76438: Sent initial data (151 bytes) 32134 1727204432.77545: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.77704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204432.77726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.77759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.77832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.79730: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204432.79764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204432.79870: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp6arnzicn /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py <<< 32134 1727204432.79874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py" <<< 32134 1727204432.79955: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp6arnzicn" to remote "/root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py" <<< 32134 1727204432.82603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.82607: stdout chunk (state=3): >>><<< 32134 1727204432.82610: stderr chunk (state=3): >>><<< 32134 1727204432.82612: done transferring module to remote 32134 1727204432.82614: _low_level_execute_command(): starting 32134 1727204432.82617: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/ /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py && sleep 0' 32134 1727204432.84109: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204432.84207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204432.84260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.84380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.84521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204432.86559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204432.86563: stdout chunk (state=3): >>><<< 32134 1727204432.86566: stderr chunk (state=3): >>><<< 32134 1727204432.86592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204432.86603: _low_level_execute_command(): starting 32134 1727204432.86613: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/AnsiballZ_dnf.py && sleep 0' 32134 1727204432.87250: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204432.87263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204432.87306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204432.87323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204432.87420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204432.87438: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204432.87452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204432.87536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.38099: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 32134 1727204434.43420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204434.43425: stdout chunk (state=3): >>><<< 32134 1727204434.43431: stderr chunk (state=3): >>><<< 32134 1727204434.43455: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
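For reference, the module invocation recorded above (ansible.legacy.dnf called with name=["iproute"] and state=present, dispatched through the 'package' action plugin loaded earlier in this task) corresponds to a task of roughly the following shape. This is a sketch inferred from the logged arguments and from the later (__install_status is success) evaluation; the actual task file is not part of this excerpt, and any retries or delay settings are not visible here:

    - name: Install iproute
      package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success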
32134 1727204434.43523: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204434.43531: _low_level_execute_command(): starting 32134 1727204434.43539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204432.4764197-32627-30209274948540/ > /dev/null 2>&1 && sleep 0' 32134 1727204434.44971: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204434.44998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204434.45206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.45284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204434.45294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204434.45440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204434.45619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.47659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204434.47759: stderr chunk (state=3): >>><<< 32134 1727204434.47763: stdout chunk (state=3): >>><<< 32134 1727204434.47881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204434.47893: handler run complete 32134 1727204434.48347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204434.48825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204434.48877: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204434.49027: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204434.49063: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204434.49495: variable '__install_status' from source: unknown 32134 1727204434.49499: Evaluated conditional (__install_status is success): True 32134 1727204434.49501: attempt loop complete, returning result 32134 1727204434.49504: _execute() done 32134 1727204434.49506: dumping result to json 32134 1727204434.49508: done dumping result, returning 32134 1727204434.49567: done running TaskExecutor() for managed-node2/TASK: Install iproute [12b410aa-8751-753f-5162-0000000001b1] 32134 1727204434.49573: sending task result for task 12b410aa-8751-753f-5162-0000000001b1 ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 32134 1727204434.49826: no more pending results, returning what we have 32134 1727204434.49831: results queue empty 32134 1727204434.49832: checking for any_errors_fatal 32134 1727204434.49837: done checking for any_errors_fatal 32134 1727204434.49838: checking for max_fail_percentage 32134 1727204434.49841: done checking for max_fail_percentage 32134 1727204434.49842: checking to see if all hosts have failed and the running result is not ok 32134 1727204434.49843: done checking to see if all hosts have failed 32134 1727204434.49844: getting the remaining hosts for this loop 32134 1727204434.49845: done getting the remaining hosts for this loop 32134 1727204434.49850: getting the next task for host managed-node2 32134 1727204434.49858: done getting next task for host managed-node2 32134 1727204434.49861: ^ task is: TASK: Create veth interface {{ interface }} 32134 1727204434.49864: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204434.49868: getting variables 32134 1727204434.49871: in VariableManager get_vars() 32134 1727204434.50139: Calling all_inventory to load vars for managed-node2 32134 1727204434.50143: Calling groups_inventory to load vars for managed-node2 32134 1727204434.50146: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204434.50161: Calling all_plugins_play to load vars for managed-node2 32134 1727204434.50164: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204434.50169: Calling groups_plugins_play to load vars for managed-node2 32134 1727204434.50886: done sending task result for task 12b410aa-8751-753f-5162-0000000001b1 32134 1727204434.50952: WORKER PROCESS EXITING 32134 1727204434.51112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204434.51761: done with get_vars() 32134 1727204434.51775: done getting variables 32134 1727204434.51924: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204434.52298: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:00:34 -0400 (0:00:02.137) 0:00:08.927 ***** 32134 1727204434.52336: entering _queue_task() for managed-node2/command 32134 1727204434.53075: worker is 1 (out of 1 available) 32134 1727204434.53088: exiting _queue_task() for managed-node2/command 32134 1727204434.53103: done queuing things up, now waiting for results queue to drain 32134 1727204434.53105: waiting for pending results... 
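The rendered task name in the header above comes from templating: the trace reports 'interface' and 'type' from source set_fact and 'state' from include params, with the values ethtest0, veth and present visible in the item and conditional evaluations below. Upstream of this excerpt they were presumably established by something like the following sketch; the task name and exact placement are hypothetical, only the variable names and values are taken from the trace (state would have been passed as an include parameter rather than set here):

    - name: Define the test interface   # hypothetical task name
      set_fact:
        interface: ethtest0
        type: veth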
32134 1727204434.53488: running TaskExecutor() for managed-node2/TASK: Create veth interface ethtest0 32134 1727204434.53725: in run() - task 12b410aa-8751-753f-5162-0000000001b2 32134 1727204434.53897: variable 'ansible_search_path' from source: unknown 32134 1727204434.53901: variable 'ansible_search_path' from source: unknown 32134 1727204434.54595: variable 'interface' from source: set_fact 32134 1727204434.54649: variable 'interface' from source: set_fact 32134 1727204434.54892: variable 'interface' from source: set_fact 32134 1727204434.55173: Loaded config def from plugin (lookup/items) 32134 1727204434.55395: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 32134 1727204434.55399: variable 'omit' from source: magic vars 32134 1727204434.55593: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204434.55711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204434.55734: variable 'omit' from source: magic vars 32134 1727204434.56695: variable 'ansible_distribution_major_version' from source: facts 32134 1727204434.56699: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204434.57129: variable 'type' from source: set_fact 32134 1727204434.57141: variable 'state' from source: include params 32134 1727204434.57151: variable 'interface' from source: set_fact 32134 1727204434.57160: variable 'current_interfaces' from source: set_fact 32134 1727204434.57171: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32134 1727204434.57184: variable 'omit' from source: magic vars 32134 1727204434.57238: variable 'omit' from source: magic vars 32134 1727204434.57357: variable 'item' from source: unknown 32134 1727204434.57580: variable 'item' from source: unknown 32134 1727204434.57607: variable 'omit' from source: magic vars 32134 1727204434.57736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204434.57776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204434.58095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204434.58098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204434.58101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204434.58103: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204434.58106: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204434.58108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204434.58323: Set connection var ansible_timeout to 10 32134 1727204434.58353: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204434.58362: Set connection var ansible_connection to ssh 32134 1727204434.58370: Set connection var ansible_shell_type to sh 32134 1727204434.58384: Set connection var ansible_shell_executable to /bin/sh 32134 1727204434.58399: Set connection var ansible_pipelining to False 32134 1727204434.58431: variable 'ansible_shell_executable' from source: unknown 32134 1727204434.58695: variable 'ansible_connection' from source: unknown 32134 1727204434.58698: variable 
'ansible_module_compression' from source: unknown 32134 1727204434.58701: variable 'ansible_shell_type' from source: unknown 32134 1727204434.58704: variable 'ansible_shell_executable' from source: unknown 32134 1727204434.58706: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204434.58708: variable 'ansible_pipelining' from source: unknown 32134 1727204434.58710: variable 'ansible_timeout' from source: unknown 32134 1727204434.58715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204434.59096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204434.59100: variable 'omit' from source: magic vars 32134 1727204434.59103: starting attempt loop 32134 1727204434.59106: running the handler 32134 1727204434.59108: _low_level_execute_command(): starting 32134 1727204434.59111: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204434.60700: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204434.60721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204434.60736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.60785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204434.60909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204434.60926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204434.61008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.62816: stdout chunk (state=3): >>>/root <<< 32134 1727204434.62997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204434.63070: stderr chunk (state=3): >>><<< 32134 1727204434.63345: stdout chunk (state=3): >>><<< 32134 1727204434.63351: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204434.63353: _low_level_execute_command(): starting 32134 1727204434.63357: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599 `" && echo ansible-tmp-1727204434.6323526-32867-138977380416599="` echo /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599 `" ) && sleep 0' 32134 1727204434.64547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204434.64551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.64554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204434.64556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.64734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204434.64749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204434.64849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.67044: stdout chunk (state=3): >>>ansible-tmp-1727204434.6323526-32867-138977380416599=/root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599 <<< 32134 1727204434.67165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204434.67266: stderr chunk (state=3): >>><<< 32134 1727204434.67596: stdout chunk (state=3): >>><<< 32134 1727204434.67600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204434.6323526-32867-138977380416599=/root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204434.67603: variable 'ansible_module_compression' from source: unknown 32134 1727204434.67605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204434.67607: variable 'ansible_facts' from source: unknown 32134 1727204434.67860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py 32134 1727204434.68221: Sending initial data 32134 1727204434.68226: Sent initial data (156 bytes) 32134 1727204434.69373: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204434.69388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204434.69508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.69522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.69590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204434.69604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204434.69833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204434.69900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.71631: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 32134 1727204434.71649: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204434.71666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204434.71817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpxxldjr1z" to remote "/root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py" <<< 32134 1727204434.71821: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpxxldjr1z /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py <<< 32134 1727204434.73796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204434.74024: stderr chunk (state=3): >>><<< 32134 1727204434.74028: stdout chunk (state=3): >>><<< 32134 1727204434.74056: done transferring module to remote 32134 1727204434.74098: _low_level_execute_command(): starting 32134 1727204434.74106: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/ /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py && sleep 0' 32134 1727204434.75468: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204434.75579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204434.75594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204434.75612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204434.75629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204434.75638: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204434.75648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.75664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204434.75683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204434.75693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32134 1727204434.75880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204434.75904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204434.76228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.78304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204434.78308: stdout chunk (state=3): >>><<< 32134 1727204434.78311: stderr chunk 
(state=3): >>><<< 32134 1727204434.78328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204434.78331: _low_level_execute_command(): starting 32134 1727204434.78339: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/AnsiballZ_command.py && sleep 0' 32134 1727204434.79897: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204434.79901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.79994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204434.80075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204434.80084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204434.80093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204434.98897: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:00:34.978730", "end": "2024-09-24 15:00:34.984156", "delta": "0:00:00.005426", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 
1727204435.02312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204435.02402: stderr chunk (state=3): >>><<< 32134 1727204435.02406: stdout chunk (state=3): >>><<< 32134 1727204435.02431: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:00:34.978730", "end": "2024-09-24 15:00:34.984156", "delta": "0:00:00.005426", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
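Putting the pieces of this task together: the two conditional evaluations, the items lookup, and the ansible.legacy.command invocation with _raw_params 'ip link add ethtest0 type veth peer name peerethtest0' imply a task of roughly this shape. It is a reconstruction from this excerpt only; the actual task at manage_test_interface.yml:27 is not shown, the remaining loop items are truncated here, the peer{{ interface }} naming is inferred from the single logged item, and how the two conditionals are split between the task and any enclosing include is not visible in the trace:

    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
      when:
        - ansible_distribution_major_version != '6'
        - type == 'veth' and state == 'present' and interface not in current_interfaces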
32134 1727204435.02482: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204435.02608: _low_level_execute_command(): starting 32134 1727204435.02612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204434.6323526-32867-138977380416599/ > /dev/null 2>&1 && sleep 0' 32134 1727204435.03997: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.04043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.04047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.04249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.09104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.09194: stderr chunk (state=3): >>><<< 32134 1727204435.09197: stdout chunk (state=3): >>><<< 32134 1727204435.09315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.09327: handler run complete 32134 1727204435.09363: Evaluated conditional (False): False 32134 1727204435.09377: attempt loop complete, returning result 32134 1727204435.09402: variable 'item' from source: unknown 32134 1727204435.09679: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005426", "end": "2024-09-24 15:00:34.984156", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 15:00:34.978730" } 32134 1727204435.10094: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204435.10098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204435.10101: variable 'omit' from source: magic vars 32134 1727204435.10535: variable 'ansible_distribution_major_version' from source: facts 32134 1727204435.10542: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204435.11294: variable 'type' from source: set_fact 32134 1727204435.11298: variable 'state' from source: include params 32134 1727204435.11300: variable 'interface' from source: set_fact 32134 1727204435.11302: variable 'current_interfaces' from source: set_fact 32134 1727204435.11304: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32134 1727204435.11306: variable 'omit' from source: magic vars 32134 1727204435.11308: variable 'omit' from source: magic vars 32134 1727204435.11511: variable 'item' from source: unknown 32134 1727204435.11592: variable 'item' from source: unknown 32134 1727204435.11623: variable 'omit' from source: magic vars 32134 1727204435.11650: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204435.11660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204435.11669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204435.11688: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204435.11693: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204435.11698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204435.11967: Set connection var ansible_timeout to 10 32134 1727204435.11983: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204435.11987: Set connection var ansible_connection to ssh 32134 1727204435.11991: Set connection var ansible_shell_type to sh 32134 1727204435.12000: Set connection var ansible_shell_executable to /bin/sh 32134 1727204435.12007: Set connection var ansible_pipelining to False 32134 1727204435.12154: variable 'ansible_shell_executable' from source: unknown 32134 1727204435.12159: variable 'ansible_connection' from source: unknown 32134 1727204435.12162: variable 
'ansible_module_compression' from source: unknown 32134 1727204435.12167: variable 'ansible_shell_type' from source: unknown 32134 1727204435.12170: variable 'ansible_shell_executable' from source: unknown 32134 1727204435.12175: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204435.12181: variable 'ansible_pipelining' from source: unknown 32134 1727204435.12184: variable 'ansible_timeout' from source: unknown 32134 1727204435.12192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204435.12501: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204435.12695: variable 'omit' from source: magic vars 32134 1727204435.12699: starting attempt loop 32134 1727204435.12702: running the handler 32134 1727204435.12705: _low_level_execute_command(): starting 32134 1727204435.12708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204435.13855: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204435.13863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204435.14175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.14180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.14188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204435.14192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.14227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.14285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.16681: stdout chunk (state=3): >>>/root <<< 32134 1727204435.16840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.16956: stderr chunk (state=3): >>><<< 32134 1727204435.16966: stdout chunk (state=3): >>><<< 32134 1727204435.17027: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.17038: _low_level_execute_command(): starting 32134 1727204435.17045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758 `" && echo ansible-tmp-1727204435.1702678-32867-162934703747758="` echo /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758 `" ) && sleep 0' 32134 1727204435.18385: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204435.18391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.18394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204435.18396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.18399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204435.18619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.18671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.21654: stdout chunk (state=3): >>>ansible-tmp-1727204435.1702678-32867-162934703747758=/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758 <<< 32134 1727204435.22002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.22006: stderr chunk (state=3): >>><<< 32134 1727204435.22011: stdout chunk (state=3): >>><<< 32134 1727204435.22107: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204435.1702678-32867-162934703747758=/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.22139: variable 'ansible_module_compression' from source: unknown 32134 1727204435.22180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204435.22320: variable 'ansible_facts' from source: unknown 32134 1727204435.22501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py 32134 1727204435.23120: Sending initial data 32134 1727204435.23123: Sent initial data (156 bytes) 32134 1727204435.24584: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.24698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.24777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.27251: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204435.27284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204435.27354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmptdfqhnpt /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py <<< 32134 1727204435.27358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py" <<< 32134 1727204435.27419: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmptdfqhnpt" to remote "/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py" <<< 32134 1727204435.29332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.29783: stderr chunk (state=3): >>><<< 32134 1727204435.29787: stdout chunk (state=3): >>><<< 32134 1727204435.29792: done transferring module to remote 32134 1727204435.29795: _low_level_execute_command(): starting 32134 1727204435.29798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/ /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py && sleep 0' 32134 1727204435.31181: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.31305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204435.31322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.31333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.31517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.34299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.34304: stderr chunk (state=3): >>><<< 32134 1727204435.34521: stdout chunk (state=3): >>><<< 32134 1727204435.34541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.34544: _low_level_execute_command(): starting 32134 1727204435.34551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/AnsiballZ_command.py && sleep 0' 32134 1727204435.35664: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204435.35907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.36007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.36102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.64478: stdout chunk (state=3): >>> <<< 32134 1727204435.64503: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:00:35.638201", "end": "2024-09-24 15:00:35.643605", "delta": "0:00:00.005404", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204435.67164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204435.67167: stdout chunk (state=3): >>><<< 32134 1727204435.67170: stderr chunk (state=3): >>><<< 32134 1727204435.67209: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:00:35.638201", "end": "2024-09-24 15:00:35.643605", "delta": "0:00:00.005404", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
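The records above complete one full module round trip for the loop item 'ip link set peerethtest0 up': home-directory discovery, per-task temp directory creation, sftp transfer of AnsiballZ_command.py, chmod, execution, and (in the record that follows) removal of the temp directory. Condensed into the bare remote-side commands the log already shows, with the long temp-directory path pulled into a TMPDIR variable purely for readability, the cycle is roughly:

  # condensed sketch of the _low_level_execute_command() calls above, not additional output
  TMPDIR=/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758
  echo ~                                                         # discover the remote home directory
  ( umask 77 && mkdir -p /root/.ansible/tmp && mkdir "$TMPDIR" ) # create the per-task temp dir
  # sftp: put AnsiballZ_command.py into "$TMPDIR"                # transfer the zipped module payload
  chmod u+x "$TMPDIR"/ "$TMPDIR"/AnsiballZ_command.py
  /usr/bin/python3.12 "$TMPDIR"/AnsiballZ_command.py             # runs the command module, prints the JSON reply
  rm -f -r "$TMPDIR"/                                            # cleanup, shown in the next record

The same cycle repeats for every loop item and task in the remainder of this log.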
32134 1727204435.67281: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204435.67298: _low_level_execute_command(): starting 32134 1727204435.67314: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204435.1702678-32867-162934703747758/ > /dev/null 2>&1 && sleep 0' 32134 1727204435.68062: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204435.68115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.68198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204435.68221: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.68254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204435.68270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.68303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.68421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.71207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.71298: stderr chunk (state=3): >>><<< 32134 1727204435.71310: stdout chunk (state=3): >>><<< 32134 1727204435.71335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.71348: handler run complete 32134 1727204435.71500: Evaluated conditional (False): False 32134 1727204435.71503: attempt loop complete, returning result 32134 1727204435.71506: variable 'item' from source: unknown 32134 1727204435.71541: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.005404", "end": "2024-09-24 15:00:35.643605", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 15:00:35.638201" } 32134 1727204435.71794: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204435.71798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204435.71802: variable 'omit' from source: magic vars 32134 1727204435.71986: variable 'ansible_distribution_major_version' from source: facts 32134 1727204435.71995: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204435.72304: variable 'type' from source: set_fact 32134 1727204435.72332: variable 'state' from source: include params 32134 1727204435.72341: variable 'interface' from source: set_fact 32134 1727204435.72345: variable 'current_interfaces' from source: set_fact 32134 1727204435.72355: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32134 1727204435.72358: variable 'omit' from source: magic vars 32134 1727204435.72360: variable 'omit' from source: magic vars 32134 1727204435.72431: variable 'item' from source: unknown 32134 1727204435.72479: variable 'item' from source: unknown 32134 1727204435.72494: variable 'omit' from source: magic vars 32134 1727204435.72516: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204435.72526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204435.72529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204435.72546: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204435.72549: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204435.72555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204435.72648: Set connection var ansible_timeout to 10 32134 1727204435.72652: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204435.72655: Set connection var ansible_connection to ssh 32134 1727204435.72657: Set connection var ansible_shell_type to sh 32134 1727204435.72662: Set connection var ansible_shell_executable to /bin/sh 32134 1727204435.72669: Set connection var ansible_pipelining to False 32134 1727204435.72707: variable 'ansible_shell_executable' from source: unknown 32134 1727204435.72710: 
variable 'ansible_connection' from source: unknown 32134 1727204435.72715: variable 'ansible_module_compression' from source: unknown 32134 1727204435.72718: variable 'ansible_shell_type' from source: unknown 32134 1727204435.72720: variable 'ansible_shell_executable' from source: unknown 32134 1727204435.72723: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204435.72725: variable 'ansible_pipelining' from source: unknown 32134 1727204435.72727: variable 'ansible_timeout' from source: unknown 32134 1727204435.72729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204435.72837: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204435.72845: variable 'omit' from source: magic vars 32134 1727204435.72851: starting attempt loop 32134 1727204435.72854: running the handler 32134 1727204435.72877: _low_level_execute_command(): starting 32134 1727204435.72882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204435.73486: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204435.73491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.73494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204435.73496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.73573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.73606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.73649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.76071: stdout chunk (state=3): >>>/root <<< 32134 1727204435.76447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.76453: stdout chunk (state=3): >>><<< 32134 1727204435.76459: stderr chunk (state=3): >>><<< 32134 1727204435.76462: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.76464: _low_level_execute_command(): starting 32134 1727204435.76466: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134 `" && echo ansible-tmp-1727204435.7634633-32867-59383844432134="` echo /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134 `" ) && sleep 0' 32134 1727204435.76943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204435.76947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.76949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204435.76952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204435.76954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.76995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204435.77000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.77055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.79947: stdout chunk (state=3): >>>ansible-tmp-1727204435.7634633-32867-59383844432134=/root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134 <<< 32134 1727204435.80184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.80187: stdout chunk (state=3): >>><<< 32134 1727204435.80197: stderr chunk (state=3): >>><<< 32134 1727204435.80221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204435.7634633-32867-59383844432134=/root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.80262: variable 'ansible_module_compression' from source: unknown 32134 1727204435.80367: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204435.80370: variable 'ansible_facts' from source: unknown 32134 1727204435.80457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py 32134 1727204435.80631: Sending initial data 32134 1727204435.80635: Sent initial data (155 bytes) 32134 1727204435.81897: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204435.81985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.82034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204435.84411: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204435.84588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204435.84592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py" <<< 32134 1727204435.84597: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpbymf_lc7 /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py <<< 32134 1727204435.84756: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpbymf_lc7" to remote "/root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py" <<< 32134 1727204435.87184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.87315: stderr chunk (state=3): >>><<< 32134 1727204435.87319: stdout chunk (state=3): >>><<< 32134 1727204435.87321: done transferring module to remote 32134 1727204435.87345: _low_level_execute_command(): starting 32134 1727204435.87356: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/ /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py && sleep 0' 32134 1727204435.88105: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204435.88109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204435.88111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204435.88114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204435.88122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204435.88131: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204435.88142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.88158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204435.88166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204435.88175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32134 1727204435.88215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204435.88219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204435.88223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204435.88225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204435.88233: stderr chunk (state=3): >>>debug2: match found <<< 32134 1727204435.88296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.88327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204435.88356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.88359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.88427: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32134 1727204435.90841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204435.91117: stderr chunk (state=3): >>><<< 32134 1727204435.91125: stdout chunk (state=3): >>><<< 32134 1727204435.91155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204435.91158: _low_level_execute_command(): starting 32134 1727204435.91161: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/AnsiballZ_command.py && sleep 0' 32134 1727204435.92511: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204435.92618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204435.92843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204435.92851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204435.92934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.11329: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:00:36.107877", "end": "2024-09-24 15:00:36.111764", "delta": "0:00:00.003887", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, 
"argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204436.13697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.13701: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 32134 1727204436.13752: stderr chunk (state=3): >>><<< 32134 1727204436.13756: stdout chunk (state=3): >>><<< 32134 1727204436.13780: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:00:36.107877", "end": "2024-09-24 15:00:36.111764", "delta": "0:00:00.003887", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
32134 1727204436.13853: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204436.13866: _low_level_execute_command(): starting 32134 1727204436.13886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204435.7634633-32867-59383844432134/ > /dev/null 2>&1 && sleep 0' 32134 1727204436.14362: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204436.14396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.14399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204436.14402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204436.14409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.14425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.14479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204436.14482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.14520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.16455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.16514: stderr chunk (state=3): >>><<< 32134 1727204436.16518: stdout chunk (state=3): >>><<< 32134 1727204436.16556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204436.16560: handler run complete 32134 1727204436.16672: Evaluated conditional (False): False 32134 1727204436.16676: attempt loop complete, returning result 32134 1727204436.16678: variable 'item' from source: unknown 32134 1727204436.16792: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003887", "end": "2024-09-24 15:00:36.111764", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 15:00:36.107877" } 32134 1727204436.16906: dumping result to json 32134 1727204436.16910: done dumping result, returning 32134 1727204436.16916: done running TaskExecutor() for managed-node2/TASK: Create veth interface ethtest0 [12b410aa-8751-753f-5162-0000000001b2] 32134 1727204436.16919: sending task result for task 12b410aa-8751-753f-5162-0000000001b2 32134 1727204436.17485: no more pending results, returning what we have 32134 1727204436.17491: results queue empty 32134 1727204436.17493: checking for any_errors_fatal 32134 1727204436.17499: done checking for any_errors_fatal 32134 1727204436.17500: checking for max_fail_percentage 32134 1727204436.17502: done checking for max_fail_percentage 32134 1727204436.17503: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.17504: done checking to see if all hosts have failed 32134 1727204436.17505: getting the remaining hosts for this loop 32134 1727204436.17506: done getting the remaining hosts for this loop 32134 1727204436.17510: getting the next task for host managed-node2 32134 1727204436.17517: done getting next task for host managed-node2 32134 1727204436.17520: ^ task is: TASK: Set up veth as managed by NetworkManager 32134 1727204436.17523: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.17532: getting variables 32134 1727204436.17534: in VariableManager get_vars() 32134 1727204436.17567: Calling all_inventory to load vars for managed-node2 32134 1727204436.17571: Calling groups_inventory to load vars for managed-node2 32134 1727204436.17574: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.17585: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.17588: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.17612: done sending task result for task 12b410aa-8751-753f-5162-0000000001b2 32134 1727204436.17616: WORKER PROCESS EXITING 32134 1727204436.17622: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.17815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.18003: done with get_vars() 32134 1727204436.18013: done getting variables 32134 1727204436.18063: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:00:36 -0400 (0:00:01.657) 0:00:10.584 ***** 32134 1727204436.18088: entering _queue_task() for managed-node2/command 32134 1727204436.18323: worker is 1 (out of 1 available) 32134 1727204436.18336: exiting _queue_task() for managed-node2/command 32134 1727204436.18353: done queuing things up, now waiting for results queue to drain 32134 1727204436.18355: waiting for pending results... 
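Before the next task starts, a short recap of what the just-finished 'Create veth interface ethtest0' task actually did on managed-node2: within this part of the log its loop executed the two commands below, each gated by the conditional type == 'veth' and state == 'present' and interface not in current_interfaces that the executor evaluated as True. This is a condensed restatement of the results above, not new output:

  # loop items executed on managed-node2, both returned rc=0
  ip link set peerethtest0 up    # delta 0:00:00.005404
  ip link set ethtest0 up        # delta 0:00:00.003887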
32134 1727204436.18537: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 32134 1727204436.18608: in run() - task 12b410aa-8751-753f-5162-0000000001b3 32134 1727204436.18624: variable 'ansible_search_path' from source: unknown 32134 1727204436.18628: variable 'ansible_search_path' from source: unknown 32134 1727204436.18659: calling self._execute() 32134 1727204436.18739: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.18746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.18757: variable 'omit' from source: magic vars 32134 1727204436.19069: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.19079: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.19221: variable 'type' from source: set_fact 32134 1727204436.19224: variable 'state' from source: include params 32134 1727204436.19232: Evaluated conditional (type == 'veth' and state == 'present'): True 32134 1727204436.19240: variable 'omit' from source: magic vars 32134 1727204436.19275: variable 'omit' from source: magic vars 32134 1727204436.19362: variable 'interface' from source: set_fact 32134 1727204436.19379: variable 'omit' from source: magic vars 32134 1727204436.19415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204436.19450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204436.19469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204436.19487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204436.19501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204436.19531: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204436.19535: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.19540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.19631: Set connection var ansible_timeout to 10 32134 1727204436.19644: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204436.19647: Set connection var ansible_connection to ssh 32134 1727204436.19649: Set connection var ansible_shell_type to sh 32134 1727204436.19657: Set connection var ansible_shell_executable to /bin/sh 32134 1727204436.19666: Set connection var ansible_pipelining to False 32134 1727204436.19686: variable 'ansible_shell_executable' from source: unknown 32134 1727204436.19691: variable 'ansible_connection' from source: unknown 32134 1727204436.19694: variable 'ansible_module_compression' from source: unknown 32134 1727204436.19700: variable 'ansible_shell_type' from source: unknown 32134 1727204436.19703: variable 'ansible_shell_executable' from source: unknown 32134 1727204436.19705: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.19711: variable 'ansible_pipelining' from source: unknown 32134 1727204436.19717: variable 'ansible_timeout' from source: unknown 32134 1727204436.19722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.19845: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204436.19856: variable 'omit' from source: magic vars 32134 1727204436.19862: starting attempt loop 32134 1727204436.19865: running the handler 32134 1727204436.19884: _low_level_execute_command(): starting 32134 1727204436.19891: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204436.20418: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.20458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.20463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.20519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204436.20523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.20575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.22331: stdout chunk (state=3): >>>/root <<< 32134 1727204436.22442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.22500: stderr chunk (state=3): >>><<< 32134 1727204436.22503: stdout chunk (state=3): >>><<< 32134 1727204436.22529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204436.22545: _low_level_execute_command(): starting 
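The block of 'Set connection var ...' records above shows the per-task connection settings resolved for managed-node2: the ssh connection plugin, an sh shell at /bin/sh, a 10 second timeout, ZIP_DEFLATED module compression, and pipelining disabled. When reproducing a run like this by hand, the same knobs can be steered from the environment; this is a hedged sketch only, with the inventory and playbook paths left as placeholders because they are not shown in this part of the log:

  # illustrative re-run; inventory.yml and playbook.yml are placeholders, not paths from this log
  ANSIBLE_PIPELINING=false ANSIBLE_TIMEOUT=10 ANSIBLE_KEEP_REMOTE_FILES=1 \
    ansible-playbook -vv -i inventory.yml playbook.yml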
32134 1727204436.22551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318 `" && echo ansible-tmp-1727204436.225285-32919-250012680946318="` echo /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318 `" ) && sleep 0' 32134 1727204436.23039: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204436.23043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204436.23055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.23058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.23060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.23106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204436.23110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.23156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.25180: stdout chunk (state=3): >>>ansible-tmp-1727204436.225285-32919-250012680946318=/root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318 <<< 32134 1727204436.25299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.25350: stderr chunk (state=3): >>><<< 32134 1727204436.25354: stdout chunk (state=3): >>><<< 32134 1727204436.25371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204436.225285-32919-250012680946318=/root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204436.25401: variable 'ansible_module_compression' from source: unknown 32134 1727204436.25449: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204436.25481: variable 'ansible_facts' from source: unknown 32134 1727204436.25548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py 32134 1727204436.25670: Sending initial data 32134 1727204436.25674: Sent initial data (155 bytes) 32134 1727204436.26141: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204436.26144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.26147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204436.26150: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.26202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204436.26208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.26246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.27903: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32134 1727204436.27908: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204436.27941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204436.27978: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp07we5vbv /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py <<< 32134 1727204436.27986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py" <<< 32134 1727204436.28016: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp07we5vbv" to remote "/root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py" <<< 32134 1727204436.28807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.28870: stderr chunk (state=3): >>><<< 32134 1727204436.28875: stdout chunk (state=3): >>><<< 32134 1727204436.28893: done transferring module to remote 32134 1727204436.28903: _low_level_execute_command(): starting 32134 1727204436.28909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/ /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py && sleep 0' 32134 1727204436.29364: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204436.29367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204436.29372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204436.29374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204436.29379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.29435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204436.29440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.29479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.31424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.31472: stderr chunk (state=3): >>><<< 32134 1727204436.31475: stdout chunk (state=3): >>><<< 32134 1727204436.31491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204436.31495: _low_level_execute_command(): starting 32134 1727204436.31501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/AnsiballZ_command.py && sleep 0' 32134 1727204436.31949: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204436.31952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.31955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.31957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.32016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204436.32019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.32066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.62419: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:00:36.585297", "end": "2024-09-24 15:00:36.620544", "delta": "0:00:00.035247", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204436.64548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.64565: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204436.64664: stderr chunk (state=3): >>><<< 32134 1727204436.64918: stdout chunk (state=3): >>><<< 32134 1727204436.64943: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:00:36.585297", "end": "2024-09-24 15:00:36.620544", "delta": "0:00:00.035247", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
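The module that just ran executed nmcli d set ethtest0 managed true on the target and returned rc=0 after roughly 35 ms. Reconstructed from the task name and the module arguments captured above, the playbook task behind this call is probably close to the sketch below; the use of {{ interface }} and the changed_when: false guard are assumptions, the latter suggested by the callback reporting "changed": false further down even though the raw module result says "changed": true.

- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true   # interface is ethtest0 in this run
  changed_when: false   # assumed; would explain the changed=false in the displayed result
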
32134 1727204436.65079: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204436.65105: _low_level_execute_command(): starting 32134 1727204436.65115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204436.225285-32919-250012680946318/ > /dev/null 2>&1 && sleep 0' 32134 1727204436.65816: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204436.65918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204436.65960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204436.65981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204436.66001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204436.66077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204436.68887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204436.68903: stdout chunk (state=3): >>><<< 32134 1727204436.68920: stderr chunk (state=3): >>><<< 32134 1727204436.68943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204436.69005: handler run complete 32134 1727204436.69179: Evaluated conditional (False): False 32134 1727204436.69186: attempt loop complete, returning result 32134 1727204436.69192: _execute() done 32134 1727204436.69194: dumping result to json 32134 1727204436.69196: done dumping result, returning 32134 1727204436.69199: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-753f-5162-0000000001b3] 32134 1727204436.69201: sending task result for task 12b410aa-8751-753f-5162-0000000001b3 32134 1727204436.69427: done sending task result for task 12b410aa-8751-753f-5162-0000000001b3 32134 1727204436.69430: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.035247", "end": "2024-09-24 15:00:36.620544", "rc": 0, "start": "2024-09-24 15:00:36.585297" } 32134 1727204436.69516: no more pending results, returning what we have 32134 1727204436.69519: results queue empty 32134 1727204436.69521: checking for any_errors_fatal 32134 1727204436.69535: done checking for any_errors_fatal 32134 1727204436.69536: checking for max_fail_percentage 32134 1727204436.69538: done checking for max_fail_percentage 32134 1727204436.69539: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.69540: done checking to see if all hosts have failed 32134 1727204436.69541: getting the remaining hosts for this loop 32134 1727204436.69543: done getting the remaining hosts for this loop 32134 1727204436.69548: getting the next task for host managed-node2 32134 1727204436.69555: done getting next task for host managed-node2 32134 1727204436.69559: ^ task is: TASK: Delete veth interface {{ interface }} 32134 1727204436.69563: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.69567: getting variables 32134 1727204436.69569: in VariableManager get_vars() 32134 1727204436.69734: Calling all_inventory to load vars for managed-node2 32134 1727204436.69738: Calling groups_inventory to load vars for managed-node2 32134 1727204436.69741: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.69753: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.69757: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.69761: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.70018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.70358: done with get_vars() 32134 1727204436.70372: done getting variables 32134 1727204436.70442: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204436.70575: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.525) 0:00:11.110 ***** 32134 1727204436.70611: entering _queue_task() for managed-node2/command 32134 1727204436.70906: worker is 1 (out of 1 available) 32134 1727204436.70920: exiting _queue_task() for managed-node2/command 32134 1727204436.70933: done queuing things up, now waiting for results queue to drain 32134 1727204436.70935: waiting for pending results... 
32134 1727204436.71719: running TaskExecutor() for managed-node2/TASK: Delete veth interface ethtest0 32134 1727204436.71725: in run() - task 12b410aa-8751-753f-5162-0000000001b4 32134 1727204436.71729: variable 'ansible_search_path' from source: unknown 32134 1727204436.71732: variable 'ansible_search_path' from source: unknown 32134 1727204436.71735: calling self._execute() 32134 1727204436.71988: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.72012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.72096: variable 'omit' from source: magic vars 32134 1727204436.72896: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.72899: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.73237: variable 'type' from source: set_fact 32134 1727204436.73254: variable 'state' from source: include params 32134 1727204436.73282: variable 'interface' from source: set_fact 32134 1727204436.73286: variable 'current_interfaces' from source: set_fact 32134 1727204436.73292: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 32134 1727204436.73295: when evaluation is False, skipping this task 32134 1727204436.73298: _execute() done 32134 1727204436.73300: dumping result to json 32134 1727204436.73303: done dumping result, returning 32134 1727204436.73341: done running TaskExecutor() for managed-node2/TASK: Delete veth interface ethtest0 [12b410aa-8751-753f-5162-0000000001b4] 32134 1727204436.73344: sending task result for task 12b410aa-8751-753f-5162-0000000001b4 32134 1727204436.73431: done sending task result for task 12b410aa-8751-753f-5162-0000000001b4 32134 1727204436.73435: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32134 1727204436.73488: no more pending results, returning what we have 32134 1727204436.73494: results queue empty 32134 1727204436.73496: checking for any_errors_fatal 32134 1727204436.73507: done checking for any_errors_fatal 32134 1727204436.73508: checking for max_fail_percentage 32134 1727204436.73510: done checking for max_fail_percentage 32134 1727204436.73511: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.73512: done checking to see if all hosts have failed 32134 1727204436.73512: getting the remaining hosts for this loop 32134 1727204436.73514: done getting the remaining hosts for this loop 32134 1727204436.73518: getting the next task for host managed-node2 32134 1727204436.73526: done getting next task for host managed-node2 32134 1727204436.73530: ^ task is: TASK: Create dummy interface {{ interface }} 32134 1727204436.73533: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.73537: getting variables 32134 1727204436.73538: in VariableManager get_vars() 32134 1727204436.73575: Calling all_inventory to load vars for managed-node2 32134 1727204436.73577: Calling groups_inventory to load vars for managed-node2 32134 1727204436.73580: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.73594: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.73597: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.73600: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.73778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.73964: done with get_vars() 32134 1727204436.73974: done getting variables 32134 1727204436.74025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204436.74121: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.035) 0:00:11.145 ***** 32134 1727204436.74147: entering _queue_task() for managed-node2/command 32134 1727204436.74372: worker is 1 (out of 1 available) 32134 1727204436.74388: exiting _queue_task() for managed-node2/command 32134 1727204436.74402: done queuing things up, now waiting for results queue to drain 32134 1727204436.74405: waiting for pending results... 
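The 'Delete veth interface ethtest0' task above was skipped, and the dummy and tap create/delete tasks that follow are skipped the same way: each task in manage_test_interface.yml carries a when: clause comparing the requested type and state against the interfaces currently on the system, and the logged false_condition strings reproduce those clauses verbatim. A condensed sketch of the pattern follows; only the task names, file line numbers and conditions are taken from this log, and the command bodies are hypothetical placeholders.

# tasks/manage_test_interface.yml (excerpt, reconstructed from the logged conditions)
- name: Delete veth interface {{ interface }}        # line 43
  command: ip link del {{ interface }}               # hypothetical body
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: Create dummy interface {{ interface }}       # line 49
  command: ip link add {{ interface }} type dummy    # hypothetical body
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}         # line 65
  command: ip link del {{ interface }}               # hypothetical body
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
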
32134 1727204436.74576: running TaskExecutor() for managed-node2/TASK: Create dummy interface ethtest0 32134 1727204436.74660: in run() - task 12b410aa-8751-753f-5162-0000000001b5 32134 1727204436.74672: variable 'ansible_search_path' from source: unknown 32134 1727204436.74675: variable 'ansible_search_path' from source: unknown 32134 1727204436.74708: calling self._execute() 32134 1727204436.74781: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.74788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.74799: variable 'omit' from source: magic vars 32134 1727204436.75098: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.75109: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.75282: variable 'type' from source: set_fact 32134 1727204436.75287: variable 'state' from source: include params 32134 1727204436.75291: variable 'interface' from source: set_fact 32134 1727204436.75305: variable 'current_interfaces' from source: set_fact 32134 1727204436.75309: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 32134 1727204436.75312: when evaluation is False, skipping this task 32134 1727204436.75314: _execute() done 32134 1727204436.75320: dumping result to json 32134 1727204436.75324: done dumping result, returning 32134 1727204436.75332: done running TaskExecutor() for managed-node2/TASK: Create dummy interface ethtest0 [12b410aa-8751-753f-5162-0000000001b5] 32134 1727204436.75337: sending task result for task 12b410aa-8751-753f-5162-0000000001b5 32134 1727204436.75437: done sending task result for task 12b410aa-8751-753f-5162-0000000001b5 32134 1727204436.75441: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32134 1727204436.75496: no more pending results, returning what we have 32134 1727204436.75500: results queue empty 32134 1727204436.75501: checking for any_errors_fatal 32134 1727204436.75507: done checking for any_errors_fatal 32134 1727204436.75508: checking for max_fail_percentage 32134 1727204436.75509: done checking for max_fail_percentage 32134 1727204436.75512: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.75513: done checking to see if all hosts have failed 32134 1727204436.75514: getting the remaining hosts for this loop 32134 1727204436.75515: done getting the remaining hosts for this loop 32134 1727204436.75519: getting the next task for host managed-node2 32134 1727204436.75524: done getting next task for host managed-node2 32134 1727204436.75527: ^ task is: TASK: Delete dummy interface {{ interface }} 32134 1727204436.75530: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.75534: getting variables 32134 1727204436.75535: in VariableManager get_vars() 32134 1727204436.75571: Calling all_inventory to load vars for managed-node2 32134 1727204436.75574: Calling groups_inventory to load vars for managed-node2 32134 1727204436.75576: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.75587: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.75600: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.75604: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.75846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.76202: done with get_vars() 32134 1727204436.76215: done getting variables 32134 1727204436.76300: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204436.76432: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.023) 0:00:11.168 ***** 32134 1727204436.76474: entering _queue_task() for managed-node2/command 32134 1727204436.76756: worker is 1 (out of 1 available) 32134 1727204436.76771: exiting _queue_task() for managed-node2/command 32134 1727204436.76782: done queuing things up, now waiting for results queue to drain 32134 1727204436.76784: waiting for pending results... 
32134 1727204436.77081: running TaskExecutor() for managed-node2/TASK: Delete dummy interface ethtest0 32134 1727204436.77168: in run() - task 12b410aa-8751-753f-5162-0000000001b6 32134 1727204436.77181: variable 'ansible_search_path' from source: unknown 32134 1727204436.77184: variable 'ansible_search_path' from source: unknown 32134 1727204436.77242: calling self._execute() 32134 1727204436.77288: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.77296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.77306: variable 'omit' from source: magic vars 32134 1727204436.77608: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.77619: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.77786: variable 'type' from source: set_fact 32134 1727204436.77797: variable 'state' from source: include params 32134 1727204436.77801: variable 'interface' from source: set_fact 32134 1727204436.77808: variable 'current_interfaces' from source: set_fact 32134 1727204436.77817: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 32134 1727204436.77820: when evaluation is False, skipping this task 32134 1727204436.77825: _execute() done 32134 1727204436.77828: dumping result to json 32134 1727204436.77833: done dumping result, returning 32134 1727204436.77840: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface ethtest0 [12b410aa-8751-753f-5162-0000000001b6] 32134 1727204436.77847: sending task result for task 12b410aa-8751-753f-5162-0000000001b6 32134 1727204436.77936: done sending task result for task 12b410aa-8751-753f-5162-0000000001b6 32134 1727204436.77939: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32134 1727204436.78000: no more pending results, returning what we have 32134 1727204436.78004: results queue empty 32134 1727204436.78006: checking for any_errors_fatal 32134 1727204436.78011: done checking for any_errors_fatal 32134 1727204436.78012: checking for max_fail_percentage 32134 1727204436.78014: done checking for max_fail_percentage 32134 1727204436.78015: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.78016: done checking to see if all hosts have failed 32134 1727204436.78017: getting the remaining hosts for this loop 32134 1727204436.78018: done getting the remaining hosts for this loop 32134 1727204436.78022: getting the next task for host managed-node2 32134 1727204436.78027: done getting next task for host managed-node2 32134 1727204436.78030: ^ task is: TASK: Create tap interface {{ interface }} 32134 1727204436.78033: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.78037: getting variables 32134 1727204436.78038: in VariableManager get_vars() 32134 1727204436.78079: Calling all_inventory to load vars for managed-node2 32134 1727204436.78082: Calling groups_inventory to load vars for managed-node2 32134 1727204436.78085: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.78097: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.78100: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.78104: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.78263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.78450: done with get_vars() 32134 1727204436.78459: done getting variables 32134 1727204436.78515: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204436.78605: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.021) 0:00:11.190 ***** 32134 1727204436.78629: entering _queue_task() for managed-node2/command 32134 1727204436.78841: worker is 1 (out of 1 available) 32134 1727204436.78857: exiting _queue_task() for managed-node2/command 32134 1727204436.78869: done queuing things up, now waiting for results queue to drain 32134 1727204436.78871: waiting for pending results... 
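The variable sources logged for these guards are worth noting: state is read 'from source: include params', while type, interface and current_interfaces come 'from source: set_fact'. That is consistent with the test play setting facts first and then passing state as a parameter when it includes manage_test_interface.yml. A hypothetical invocation matching those sources is sketched below; the values for type and state are inferred from which tasks ran or were skipped above, the relative include path is assumed, and everything else is illustrative.

- name: Set facts for the test interface      # hypothetical caller, matching the set_fact sources
  set_fact:
    interface: ethtest0
    type: veth

- name: Manage the test interface             # hypothetical; state arrives as an include parameter
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present
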
32134 1727204436.79044: running TaskExecutor() for managed-node2/TASK: Create tap interface ethtest0 32134 1727204436.79121: in run() - task 12b410aa-8751-753f-5162-0000000001b7 32134 1727204436.79134: variable 'ansible_search_path' from source: unknown 32134 1727204436.79137: variable 'ansible_search_path' from source: unknown 32134 1727204436.79168: calling self._execute() 32134 1727204436.79242: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.79249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.79259: variable 'omit' from source: magic vars 32134 1727204436.79558: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.79568: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.79768: variable 'type' from source: set_fact 32134 1727204436.79772: variable 'state' from source: include params 32134 1727204436.79778: variable 'interface' from source: set_fact 32134 1727204436.79784: variable 'current_interfaces' from source: set_fact 32134 1727204436.79793: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 32134 1727204436.79799: when evaluation is False, skipping this task 32134 1727204436.79802: _execute() done 32134 1727204436.79807: dumping result to json 32134 1727204436.79818: done dumping result, returning 32134 1727204436.79848: done running TaskExecutor() for managed-node2/TASK: Create tap interface ethtest0 [12b410aa-8751-753f-5162-0000000001b7] 32134 1727204436.79851: sending task result for task 12b410aa-8751-753f-5162-0000000001b7 32134 1727204436.80055: done sending task result for task 12b410aa-8751-753f-5162-0000000001b7 32134 1727204436.80059: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32134 1727204436.80141: no more pending results, returning what we have 32134 1727204436.80144: results queue empty 32134 1727204436.80145: checking for any_errors_fatal 32134 1727204436.80150: done checking for any_errors_fatal 32134 1727204436.80151: checking for max_fail_percentage 32134 1727204436.80153: done checking for max_fail_percentage 32134 1727204436.80154: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.80155: done checking to see if all hosts have failed 32134 1727204436.80156: getting the remaining hosts for this loop 32134 1727204436.80157: done getting the remaining hosts for this loop 32134 1727204436.80162: getting the next task for host managed-node2 32134 1727204436.80167: done getting next task for host managed-node2 32134 1727204436.80170: ^ task is: TASK: Delete tap interface {{ interface }} 32134 1727204436.80173: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.80177: getting variables 32134 1727204436.80178: in VariableManager get_vars() 32134 1727204436.80218: Calling all_inventory to load vars for managed-node2 32134 1727204436.80221: Calling groups_inventory to load vars for managed-node2 32134 1727204436.80225: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.80236: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.80239: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.80243: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.80883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.81248: done with get_vars() 32134 1727204436.81260: done getting variables 32134 1727204436.81346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204436.81481: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.028) 0:00:11.219 ***** 32134 1727204436.81530: entering _queue_task() for managed-node2/command 32134 1727204436.81977: worker is 1 (out of 1 available) 32134 1727204436.81998: exiting _queue_task() for managed-node2/command 32134 1727204436.82010: done queuing things up, now waiting for results queue to drain 32134 1727204436.82015: waiting for pending results... 
32134 1727204436.82229: running TaskExecutor() for managed-node2/TASK: Delete tap interface ethtest0 32134 1727204436.82369: in run() - task 12b410aa-8751-753f-5162-0000000001b8 32134 1727204436.82390: variable 'ansible_search_path' from source: unknown 32134 1727204436.82394: variable 'ansible_search_path' from source: unknown 32134 1727204436.82443: calling self._execute() 32134 1727204436.82551: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.82561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.82794: variable 'omit' from source: magic vars 32134 1727204436.83065: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.83078: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.83397: variable 'type' from source: set_fact 32134 1727204436.83403: variable 'state' from source: include params 32134 1727204436.83409: variable 'interface' from source: set_fact 32134 1727204436.83422: variable 'current_interfaces' from source: set_fact 32134 1727204436.83437: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 32134 1727204436.83441: when evaluation is False, skipping this task 32134 1727204436.83443: _execute() done 32134 1727204436.83448: dumping result to json 32134 1727204436.83453: done dumping result, returning 32134 1727204436.83461: done running TaskExecutor() for managed-node2/TASK: Delete tap interface ethtest0 [12b410aa-8751-753f-5162-0000000001b8] 32134 1727204436.83476: sending task result for task 12b410aa-8751-753f-5162-0000000001b8 32134 1727204436.83579: done sending task result for task 12b410aa-8751-753f-5162-0000000001b8 32134 1727204436.83583: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32134 1727204436.83652: no more pending results, returning what we have 32134 1727204436.83657: results queue empty 32134 1727204436.83658: checking for any_errors_fatal 32134 1727204436.83667: done checking for any_errors_fatal 32134 1727204436.83669: checking for max_fail_percentage 32134 1727204436.83670: done checking for max_fail_percentage 32134 1727204436.83672: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.83673: done checking to see if all hosts have failed 32134 1727204436.83674: getting the remaining hosts for this loop 32134 1727204436.83675: done getting the remaining hosts for this loop 32134 1727204436.83742: getting the next task for host managed-node2 32134 1727204436.83754: done getting next task for host managed-node2 32134 1727204436.83759: ^ task is: TASK: Include the task 'assert_device_present.yml' 32134 1727204436.83763: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.83767: getting variables 32134 1727204436.83769: in VariableManager get_vars() 32134 1727204436.83817: Calling all_inventory to load vars for managed-node2 32134 1727204436.83821: Calling groups_inventory to load vars for managed-node2 32134 1727204436.83824: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.83841: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.83964: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.83970: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.84347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.84709: done with get_vars() 32134 1727204436.84724: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:20 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.033) 0:00:11.252 ***** 32134 1727204436.84847: entering _queue_task() for managed-node2/include_tasks 32134 1727204436.85164: worker is 1 (out of 1 available) 32134 1727204436.85178: exiting _queue_task() for managed-node2/include_tasks 32134 1727204436.85314: done queuing things up, now waiting for results queue to drain 32134 1727204436.85317: waiting for pending results... 32134 1727204436.85516: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 32134 1727204436.85632: in run() - task 12b410aa-8751-753f-5162-00000000000e 32134 1727204436.85656: variable 'ansible_search_path' from source: unknown 32134 1727204436.85694: calling self._execute() 32134 1727204436.85808: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.85819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.85831: variable 'omit' from source: magic vars 32134 1727204436.86494: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.86498: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.86501: _execute() done 32134 1727204436.86504: dumping result to json 32134 1727204436.86507: done dumping result, returning 32134 1727204436.86509: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [12b410aa-8751-753f-5162-00000000000e] 32134 1727204436.86511: sending task result for task 12b410aa-8751-753f-5162-00000000000e 32134 1727204436.86583: done sending task result for task 12b410aa-8751-753f-5162-00000000000e 32134 1727204436.86586: WORKER PROCESS EXITING 32134 1727204436.86627: no more pending results, returning what we have 32134 1727204436.86637: in VariableManager get_vars() 32134 1727204436.86683: Calling all_inventory to load vars for managed-node2 32134 1727204436.86687: Calling groups_inventory to load vars for managed-node2 32134 1727204436.86692: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.86708: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.86715: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.86720: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.87264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.87623: done with get_vars() 32134 
1727204436.87633: variable 'ansible_search_path' from source: unknown 32134 1727204436.87647: we have included files to process 32134 1727204436.87648: generating all_blocks data 32134 1727204436.87650: done generating all_blocks data 32134 1727204436.87657: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32134 1727204436.87659: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32134 1727204436.87662: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32134 1727204436.87880: in VariableManager get_vars() 32134 1727204436.87919: done with get_vars() 32134 1727204436.88071: done processing included file 32134 1727204436.88074: iterating over new_blocks loaded from include file 32134 1727204436.88076: in VariableManager get_vars() 32134 1727204436.88096: done with get_vars() 32134 1727204436.88098: filtering new block on tags 32134 1727204436.88136: done filtering new block on tags 32134 1727204436.88139: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 32134 1727204436.88146: extending task lists for all hosts with included blocks 32134 1727204436.90584: done extending task lists 32134 1727204436.90586: done processing included files 32134 1727204436.90587: results queue empty 32134 1727204436.90590: checking for any_errors_fatal 32134 1727204436.90594: done checking for any_errors_fatal 32134 1727204436.90596: checking for max_fail_percentage 32134 1727204436.90597: done checking for max_fail_percentage 32134 1727204436.90598: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.90599: done checking to see if all hosts have failed 32134 1727204436.90600: getting the remaining hosts for this loop 32134 1727204436.90602: done getting the remaining hosts for this loop 32134 1727204436.90605: getting the next task for host managed-node2 32134 1727204436.90617: done getting next task for host managed-node2 32134 1727204436.90620: ^ task is: TASK: Include the task 'get_interface_stat.yml' 32134 1727204436.90630: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.90633: getting variables 32134 1727204436.90635: in VariableManager get_vars() 32134 1727204436.90653: Calling all_inventory to load vars for managed-node2 32134 1727204436.90656: Calling groups_inventory to load vars for managed-node2 32134 1727204436.90658: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.90666: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.90669: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.90673: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.90946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.91306: done with get_vars() 32134 1727204436.91321: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.065) 0:00:11.318 ***** 32134 1727204436.91425: entering _queue_task() for managed-node2/include_tasks 32134 1727204436.92011: worker is 1 (out of 1 available) 32134 1727204436.92025: exiting _queue_task() for managed-node2/include_tasks 32134 1727204436.92037: done queuing things up, now waiting for results queue to drain 32134 1727204436.92039: waiting for pending results... 32134 1727204436.92209: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 32134 1727204436.92282: in run() - task 12b410aa-8751-753f-5162-0000000002bc 32134 1727204436.92298: variable 'ansible_search_path' from source: unknown 32134 1727204436.92302: variable 'ansible_search_path' from source: unknown 32134 1727204436.92395: calling self._execute() 32134 1727204436.92470: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.92479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.92500: variable 'omit' from source: magic vars 32134 1727204436.93056: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.93194: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.93197: _execute() done 32134 1727204436.93200: dumping result to json 32134 1727204436.93202: done dumping result, returning 32134 1727204436.93204: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-753f-5162-0000000002bc] 32134 1727204436.93206: sending task result for task 12b410aa-8751-753f-5162-0000000002bc 32134 1727204436.93305: no more pending results, returning what we have 32134 1727204436.93318: in VariableManager get_vars() 32134 1727204436.93367: Calling all_inventory to load vars for managed-node2 32134 1727204436.93371: Calling groups_inventory to load vars for managed-node2 32134 1727204436.93374: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.93394: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.93398: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.93403: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.93415: done sending task result for task 12b410aa-8751-753f-5162-0000000002bc 32134 1727204436.93420: WORKER PROCESS EXITING 32134 1727204436.93956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 32134 1727204436.94324: done with get_vars() 32134 1727204436.94335: variable 'ansible_search_path' from source: unknown 32134 1727204436.94336: variable 'ansible_search_path' from source: unknown 32134 1727204436.94427: we have included files to process 32134 1727204436.94429: generating all_blocks data 32134 1727204436.94431: done generating all_blocks data 32134 1727204436.94432: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32134 1727204436.94433: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32134 1727204436.94437: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32134 1727204436.95084: done processing included file 32134 1727204436.95086: iterating over new_blocks loaded from include file 32134 1727204436.95097: in VariableManager get_vars() 32134 1727204436.95324: done with get_vars() 32134 1727204436.95327: filtering new block on tags 32134 1727204436.95348: done filtering new block on tags 32134 1727204436.95351: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 32134 1727204436.95357: extending task lists for all hosts with included blocks 32134 1727204436.95717: done extending task lists 32134 1727204436.95719: done processing included files 32134 1727204436.95720: results queue empty 32134 1727204436.95721: checking for any_errors_fatal 32134 1727204436.95724: done checking for any_errors_fatal 32134 1727204436.95726: checking for max_fail_percentage 32134 1727204436.95727: done checking for max_fail_percentage 32134 1727204436.95728: checking to see if all hosts have failed and the running result is not ok 32134 1727204436.95729: done checking to see if all hosts have failed 32134 1727204436.95730: getting the remaining hosts for this loop 32134 1727204436.95732: done getting the remaining hosts for this loop 32134 1727204436.95735: getting the next task for host managed-node2 32134 1727204436.95740: done getting next task for host managed-node2 32134 1727204436.95743: ^ task is: TASK: Get stat for interface {{ interface }} 32134 1727204436.95747: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204436.95821: getting variables 32134 1727204436.95823: in VariableManager get_vars() 32134 1727204436.95838: Calling all_inventory to load vars for managed-node2 32134 1727204436.95841: Calling groups_inventory to load vars for managed-node2 32134 1727204436.95844: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204436.95851: Calling all_plugins_play to load vars for managed-node2 32134 1727204436.95854: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204436.95864: Calling groups_plugins_play to load vars for managed-node2 32134 1727204436.96430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204436.96884: done with get_vars() 32134 1727204436.96897: done getting variables 32134 1727204436.97101: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:00:36 -0400 (0:00:00.057) 0:00:11.375 ***** 32134 1727204436.97147: entering _queue_task() for managed-node2/stat 32134 1727204436.97532: worker is 1 (out of 1 available) 32134 1727204436.97546: exiting _queue_task() for managed-node2/stat 32134 1727204436.97568: done queuing things up, now waiting for results queue to drain 32134 1727204436.97570: waiting for pending results... 32134 1727204436.97747: running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 32134 1727204436.97868: in run() - task 12b410aa-8751-753f-5162-000000000373 32134 1727204436.97987: variable 'ansible_search_path' from source: unknown 32134 1727204436.97997: variable 'ansible_search_path' from source: unknown 32134 1727204436.98004: calling self._execute() 32134 1727204436.98041: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.98045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204436.98048: variable 'omit' from source: magic vars 32134 1727204436.98624: variable 'ansible_distribution_major_version' from source: facts 32134 1727204436.98702: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204436.98706: variable 'omit' from source: magic vars 32134 1727204436.98763: variable 'omit' from source: magic vars 32134 1727204436.98876: variable 'interface' from source: set_fact 32134 1727204436.99209: variable 'omit' from source: magic vars 32134 1727204436.99247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204436.99285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204436.99314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204436.99344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204436.99348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204436.99380: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204436.99387: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204436.99392: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 32134 1727204436.99725: Set connection var ansible_timeout to 10 32134 1727204436.99766: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204436.99771: Set connection var ansible_connection to ssh 32134 1727204436.99774: Set connection var ansible_shell_type to sh 32134 1727204436.99776: Set connection var ansible_shell_executable to /bin/sh 32134 1727204436.99779: Set connection var ansible_pipelining to False 32134 1727204436.99781: variable 'ansible_shell_executable' from source: unknown 32134 1727204436.99784: variable 'ansible_connection' from source: unknown 32134 1727204436.99787: variable 'ansible_module_compression' from source: unknown 32134 1727204436.99795: variable 'ansible_shell_type' from source: unknown 32134 1727204437.00251: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.00255: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.00257: variable 'ansible_pipelining' from source: unknown 32134 1727204437.00260: variable 'ansible_timeout' from source: unknown 32134 1727204437.00263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.00580: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204437.00585: variable 'omit' from source: magic vars 32134 1727204437.00587: starting attempt loop 32134 1727204437.00592: running the handler 32134 1727204437.00595: _low_level_execute_command(): starting 32134 1727204437.00597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204437.02614: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.02648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204437.02660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.02750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.05207: stdout chunk (state=3): >>>/root <<< 32134 1727204437.05390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.05521: stderr chunk (state=3): >>><<< 32134 1727204437.05532: stdout chunk (state=3): >>><<< 32134 1727204437.05561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204437.05578: _low_level_execute_command(): starting 32134 1727204437.05592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490 `" && echo ansible-tmp-1727204437.0556033-32948-169792885042490="` echo /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490 `" ) && sleep 0' 32134 1727204437.06126: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.06151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.06155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.06224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204437.06227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.06274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.09095: stdout chunk (state=3): >>>ansible-tmp-1727204437.0556033-32948-169792885042490=/root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490 <<< 32134 1727204437.09278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.09330: stderr chunk (state=3): >>><<< 32134 1727204437.09334: stdout chunk (state=3): >>><<< 32134 1727204437.09353: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204437.0556033-32948-169792885042490=/root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204437.09398: variable 'ansible_module_compression' from source: unknown 32134 1727204437.09451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32134 1727204437.09485: variable 'ansible_facts' from source: unknown 32134 1727204437.09558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py 32134 1727204437.09679: Sending initial data 32134 1727204437.09683: Sent initial data (153 bytes) 32134 1727204437.10148: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.10152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204437.10156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.10159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.10162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.10215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204437.10219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.10281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.12716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204437.12769: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204437.12852: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp8k14h1gu /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py <<< 32134 1727204437.12856: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py" <<< 32134 1727204437.12902: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp8k14h1gu" to remote "/root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py" <<< 32134 1727204437.14581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.14855: stderr chunk (state=3): >>><<< 32134 1727204437.14858: stdout chunk (state=3): >>><<< 32134 1727204437.14861: done transferring module to remote 32134 1727204437.14863: _low_level_execute_command(): starting 32134 1727204437.14866: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/ /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py && sleep 0' 32134 1727204437.16025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.16107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.16187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204437.16237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.16343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.19110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.19117: stdout chunk (state=3): >>><<< 32134 1727204437.19120: stderr chunk (state=3): >>><<< 32134 1727204437.19242: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204437.19251: _low_level_execute_command(): starting 32134 1727204437.19254: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/AnsiballZ_stat.py && sleep 0' 32134 1727204437.20569: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204437.20669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.20743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.20839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.20955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.21239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204437.21276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.21380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.47014: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38121, "dev": 23, "nlink": 1, "atime": 1727204434.98285, "mtime": 1727204434.98285, "ctime": 1727204434.98285, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, 
"device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32134 1727204437.48580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204437.48585: stdout chunk (state=3): >>><<< 32134 1727204437.48587: stderr chunk (state=3): >>><<< 32134 1727204437.48593: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38121, "dev": 23, "nlink": 1, "atime": 1727204434.98285, "mtime": 1727204434.98285, "ctime": 1727204434.98285, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
32134 1727204437.48615: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204437.48797: _low_level_execute_command(): starting 32134 1727204437.48801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204437.0556033-32948-169792885042490/ > /dev/null 2>&1 && sleep 0' 32134 1727204437.49372: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204437.49387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.49404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.49422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204437.49478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.49540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204437.49557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204437.49585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.49658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.52102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.52105: stdout chunk (state=3): >>><<< 32134 1727204437.52108: stderr chunk (state=3): >>><<< 32134 1727204437.52117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204437.52119: handler run complete 32134 1727204437.52121: attempt loop complete, returning result 32134 1727204437.52123: _execute() done 32134 1727204437.52125: dumping result to json 32134 1727204437.52127: done dumping result, returning 32134 1727204437.52129: done running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 [12b410aa-8751-753f-5162-000000000373] 32134 1727204437.52131: sending task result for task 12b410aa-8751-753f-5162-000000000373 32134 1727204437.52215: done sending task result for task 12b410aa-8751-753f-5162-000000000373 32134 1727204437.52219: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204434.98285, "block_size": 4096, "blocks": 0, "ctime": 1727204434.98285, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38121, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204434.98285, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 32134 1727204437.52342: no more pending results, returning what we have 32134 1727204437.52347: results queue empty 32134 1727204437.52348: checking for any_errors_fatal 32134 1727204437.52350: done checking for any_errors_fatal 32134 1727204437.52351: checking for max_fail_percentage 32134 1727204437.52352: done checking for max_fail_percentage 32134 1727204437.52354: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.52355: done checking to see if all hosts have failed 32134 1727204437.52356: getting the remaining hosts for this loop 32134 1727204437.52357: done getting the remaining hosts for this loop 32134 1727204437.52361: getting the next task for host managed-node2 32134 1727204437.52369: done getting next task for host managed-node2 32134 1727204437.52372: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 32134 1727204437.52375: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204437.52379: getting variables 32134 1727204437.52381: in VariableManager get_vars() 32134 1727204437.52429: Calling all_inventory to load vars for managed-node2 32134 1727204437.52433: Calling groups_inventory to load vars for managed-node2 32134 1727204437.52436: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.52448: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.52452: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.52456: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.52733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.53061: done with get_vars() 32134 1727204437.53076: done getting variables 32134 1727204437.53190: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 32134 1727204437.53375: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.562) 0:00:11.938 ***** 32134 1727204437.53423: entering _queue_task() for managed-node2/assert 32134 1727204437.53425: Creating lock for assert 32134 1727204437.53764: worker is 1 (out of 1 available) 32134 1727204437.53781: exiting _queue_task() for managed-node2/assert 32134 1727204437.53797: done queuing things up, now waiting for results queue to drain 32134 1727204437.53799: waiting for pending results... 
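
The assert task announced above (assert_device_present.yml:5) evaluates a single conditional against the registered stat result. A rough sketch, assuming the default assert failure message; only the conditional interface_stat.stat.exists is actually visible in the log that follows:

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
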
32134 1727204437.54174: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'ethtest0' 32134 1727204437.54229: in run() - task 12b410aa-8751-753f-5162-0000000002bd 32134 1727204437.54299: variable 'ansible_search_path' from source: unknown 32134 1727204437.54303: variable 'ansible_search_path' from source: unknown 32134 1727204437.54395: calling self._execute() 32134 1727204437.54434: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.54448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.54464: variable 'omit' from source: magic vars 32134 1727204437.55043: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.55073: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.55085: variable 'omit' from source: magic vars 32134 1727204437.55141: variable 'omit' from source: magic vars 32134 1727204437.55279: variable 'interface' from source: set_fact 32134 1727204437.55315: variable 'omit' from source: magic vars 32134 1727204437.55388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204437.55421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204437.55595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204437.55599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204437.55601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204437.55604: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204437.55606: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.55608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.55692: Set connection var ansible_timeout to 10 32134 1727204437.55718: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204437.55739: Set connection var ansible_connection to ssh 32134 1727204437.55748: Set connection var ansible_shell_type to sh 32134 1727204437.55761: Set connection var ansible_shell_executable to /bin/sh 32134 1727204437.55773: Set connection var ansible_pipelining to False 32134 1727204437.55840: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.55849: variable 'ansible_connection' from source: unknown 32134 1727204437.55852: variable 'ansible_module_compression' from source: unknown 32134 1727204437.55854: variable 'ansible_shell_type' from source: unknown 32134 1727204437.55856: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.55858: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.55863: variable 'ansible_pipelining' from source: unknown 32134 1727204437.55872: variable 'ansible_timeout' from source: unknown 32134 1727204437.55950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.56088: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 32134 1727204437.56109: variable 'omit' from source: magic vars 32134 1727204437.56124: starting attempt loop 32134 1727204437.56132: running the handler 32134 1727204437.56330: variable 'interface_stat' from source: set_fact 32134 1727204437.56360: Evaluated conditional (interface_stat.stat.exists): True 32134 1727204437.56374: handler run complete 32134 1727204437.56415: attempt loop complete, returning result 32134 1727204437.56425: _execute() done 32134 1727204437.56433: dumping result to json 32134 1727204437.56496: done dumping result, returning 32134 1727204437.56506: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'ethtest0' [12b410aa-8751-753f-5162-0000000002bd] 32134 1727204437.56509: sending task result for task 12b410aa-8751-753f-5162-0000000002bd ok: [managed-node2] => { "changed": false } MSG: All assertions passed 32134 1727204437.56669: no more pending results, returning what we have 32134 1727204437.56674: results queue empty 32134 1727204437.56675: checking for any_errors_fatal 32134 1727204437.56684: done checking for any_errors_fatal 32134 1727204437.56685: checking for max_fail_percentage 32134 1727204437.56687: done checking for max_fail_percentage 32134 1727204437.56688: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.56691: done checking to see if all hosts have failed 32134 1727204437.56692: getting the remaining hosts for this loop 32134 1727204437.56695: done getting the remaining hosts for this loop 32134 1727204437.56700: getting the next task for host managed-node2 32134 1727204437.56896: done getting next task for host managed-node2 32134 1727204437.56900: ^ task is: TASK: Initialize the connection_failed flag 32134 1727204437.56903: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204437.56907: getting variables 32134 1727204437.56908: in VariableManager get_vars() 32134 1727204437.56947: Calling all_inventory to load vars for managed-node2 32134 1727204437.56950: Calling groups_inventory to load vars for managed-node2 32134 1727204437.56953: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.56964: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.56967: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.56970: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.57388: done sending task result for task 12b410aa-8751-753f-5162-0000000002bd 32134 1727204437.57393: WORKER PROCESS EXITING 32134 1727204437.57423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.57779: done with get_vars() 32134 1727204437.57797: done getting variables 32134 1727204437.57874: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize the connection_failed flag] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:23 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.044) 0:00:11.983 ***** 32134 1727204437.57907: entering _queue_task() for managed-node2/set_fact 32134 1727204437.58309: worker is 1 (out of 1 available) 32134 1727204437.58329: exiting _queue_task() for managed-node2/set_fact 32134 1727204437.58343: done queuing things up, now waiting for results queue to drain 32134 1727204437.58345: waiting for pending results... 
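
The set_fact task queued above (tests_ipv6_disabled.yml:23) produces the result dumped further on; based on that result, it amounts to:

- name: Initialize the connection_failed flag
  set_fact:
    connection_failed: false
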
32134 1727204437.58562: running TaskExecutor() for managed-node2/TASK: Initialize the connection_failed flag 32134 1727204437.58683: in run() - task 12b410aa-8751-753f-5162-00000000000f 32134 1727204437.58706: variable 'ansible_search_path' from source: unknown 32134 1727204437.58764: calling self._execute() 32134 1727204437.58881: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.58900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.58921: variable 'omit' from source: magic vars 32134 1727204437.59403: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.59429: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.59447: variable 'omit' from source: magic vars 32134 1727204437.59476: variable 'omit' from source: magic vars 32134 1727204437.59539: variable 'omit' from source: magic vars 32134 1727204437.59597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204437.59658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204437.59688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204437.59727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204437.59775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204437.59801: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204437.59811: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.59836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.59993: Set connection var ansible_timeout to 10 32134 1727204437.60005: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204437.60017: Set connection var ansible_connection to ssh 32134 1727204437.60054: Set connection var ansible_shell_type to sh 32134 1727204437.60057: Set connection var ansible_shell_executable to /bin/sh 32134 1727204437.60064: Set connection var ansible_pipelining to False 32134 1727204437.60088: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.60105: variable 'ansible_connection' from source: unknown 32134 1727204437.60163: variable 'ansible_module_compression' from source: unknown 32134 1727204437.60171: variable 'ansible_shell_type' from source: unknown 32134 1727204437.60174: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.60176: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.60179: variable 'ansible_pipelining' from source: unknown 32134 1727204437.60181: variable 'ansible_timeout' from source: unknown 32134 1727204437.60184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.60355: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204437.60373: variable 'omit' from source: magic vars 32134 1727204437.60394: starting attempt loop 32134 1727204437.60425: 
running the handler 32134 1727204437.60428: handler run complete 32134 1727204437.60442: attempt loop complete, returning result 32134 1727204437.60449: _execute() done 32134 1727204437.60456: dumping result to json 32134 1727204437.60464: done dumping result, returning 32134 1727204437.60489: done running TaskExecutor() for managed-node2/TASK: Initialize the connection_failed flag [12b410aa-8751-753f-5162-00000000000f] 32134 1727204437.60493: sending task result for task 12b410aa-8751-753f-5162-00000000000f 32134 1727204437.60664: done sending task result for task 12b410aa-8751-753f-5162-00000000000f 32134 1727204437.60668: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "connection_failed": false }, "changed": false } 32134 1727204437.60745: no more pending results, returning what we have 32134 1727204437.60749: results queue empty 32134 1727204437.60751: checking for any_errors_fatal 32134 1727204437.60758: done checking for any_errors_fatal 32134 1727204437.60759: checking for max_fail_percentage 32134 1727204437.60761: done checking for max_fail_percentage 32134 1727204437.60762: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.60763: done checking to see if all hosts have failed 32134 1727204437.60764: getting the remaining hosts for this loop 32134 1727204437.60765: done getting the remaining hosts for this loop 32134 1727204437.60770: getting the next task for host managed-node2 32134 1727204437.60968: done getting next task for host managed-node2 32134 1727204437.60974: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32134 1727204437.60977: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204437.60996: getting variables 32134 1727204437.60998: in VariableManager get_vars() 32134 1727204437.61038: Calling all_inventory to load vars for managed-node2 32134 1727204437.61042: Calling groups_inventory to load vars for managed-node2 32134 1727204437.61045: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.61055: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.61059: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.61063: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.61346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.61749: done with get_vars() 32134 1727204437.61763: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.039) 0:00:12.022 ***** 32134 1727204437.61884: entering _queue_task() for managed-node2/include_tasks 32134 1727204437.62393: worker is 1 (out of 1 available) 32134 1727204437.62403: exiting _queue_task() for managed-node2/include_tasks 32134 1727204437.62417: done queuing things up, now waiting for results queue to drain 32134 1727204437.62419: waiting for pending results... 32134 1727204437.62610: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32134 1727204437.62708: in run() - task 12b410aa-8751-753f-5162-000000000017 32134 1727204437.62714: variable 'ansible_search_path' from source: unknown 32134 1727204437.62717: variable 'ansible_search_path' from source: unknown 32134 1727204437.62736: calling self._execute() 32134 1727204437.62843: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.62864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.62881: variable 'omit' from source: magic vars 32134 1727204437.63340: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.63362: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.63409: _execute() done 32134 1727204437.63415: dumping result to json 32134 1727204437.63418: done dumping result, returning 32134 1727204437.63421: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-753f-5162-000000000017] 32134 1727204437.63423: sending task result for task 12b410aa-8751-753f-5162-000000000017 32134 1727204437.63572: done sending task result for task 12b410aa-8751-753f-5162-000000000017 32134 1727204437.63575: WORKER PROCESS EXITING 32134 1727204437.63739: no more pending results, returning what we have 32134 1727204437.63745: in VariableManager get_vars() 32134 1727204437.63785: Calling all_inventory to load vars for managed-node2 32134 1727204437.63788: Calling groups_inventory to load vars for managed-node2 32134 1727204437.63794: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.63804: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.63807: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.63814: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.64142: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.64503: done with get_vars() 32134 1727204437.64515: variable 'ansible_search_path' from source: unknown 32134 1727204437.64516: variable 'ansible_search_path' from source: unknown 32134 1727204437.64562: we have included files to process 32134 1727204437.64564: generating all_blocks data 32134 1727204437.64566: done generating all_blocks data 32134 1727204437.64571: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204437.64572: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204437.64574: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204437.65571: done processing included file 32134 1727204437.65573: iterating over new_blocks loaded from include file 32134 1727204437.65575: in VariableManager get_vars() 32134 1727204437.65610: done with get_vars() 32134 1727204437.65615: filtering new block on tags 32134 1727204437.65637: done filtering new block on tags 32134 1727204437.65641: in VariableManager get_vars() 32134 1727204437.65666: done with get_vars() 32134 1727204437.65668: filtering new block on tags 32134 1727204437.65704: done filtering new block on tags 32134 1727204437.65707: in VariableManager get_vars() 32134 1727204437.65736: done with get_vars() 32134 1727204437.65738: filtering new block on tags 32134 1727204437.65762: done filtering new block on tags 32134 1727204437.65765: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 32134 1727204437.65771: extending task lists for all hosts with included blocks 32134 1727204437.66983: done extending task lists 32134 1727204437.66985: done processing included files 32134 1727204437.66986: results queue empty 32134 1727204437.66987: checking for any_errors_fatal 32134 1727204437.66992: done checking for any_errors_fatal 32134 1727204437.66993: checking for max_fail_percentage 32134 1727204437.66994: done checking for max_fail_percentage 32134 1727204437.66995: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.66997: done checking to see if all hosts have failed 32134 1727204437.66998: getting the remaining hosts for this loop 32134 1727204437.66999: done getting the remaining hosts for this loop 32134 1727204437.67009: getting the next task for host managed-node2 32134 1727204437.67016: done getting next task for host managed-node2 32134 1727204437.67019: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32134 1727204437.67023: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204437.67034: getting variables 32134 1727204437.67035: in VariableManager get_vars() 32134 1727204437.67051: Calling all_inventory to load vars for managed-node2 32134 1727204437.67054: Calling groups_inventory to load vars for managed-node2 32134 1727204437.67056: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.67062: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.67066: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.67069: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.67339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.67684: done with get_vars() 32134 1727204437.67697: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.058) 0:00:12.081 ***** 32134 1727204437.67782: entering _queue_task() for managed-node2/setup 32134 1727204437.68088: worker is 1 (out of 1 available) 32134 1727204437.68305: exiting _queue_task() for managed-node2/setup 32134 1727204437.68319: done queuing things up, now waiting for results queue to drain 32134 1727204437.68321: waiting for pending results... 32134 1727204437.68510: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32134 1727204437.68696: in run() - task 12b410aa-8751-753f-5162-00000000038e 32134 1727204437.68700: variable 'ansible_search_path' from source: unknown 32134 1727204437.68703: variable 'ansible_search_path' from source: unknown 32134 1727204437.68736: calling self._execute() 32134 1727204437.68852: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.68874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.68935: variable 'omit' from source: magic vars 32134 1727204437.69368: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.69388: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.69719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204437.71908: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204437.71974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204437.72008: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204437.72043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204437.72066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204437.72141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
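
The task announced above (set_facts.yml:3) is a setup (fact-gathering) task guarded by the conditional evaluated just below; because every fact named in __network_required_facts is already present in ansible_facts, the guard is False and the task is skipped. A hedged sketch, with the gather arguments marked as placeholders since a skipped task never logs them:

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # placeholder; the real subset argument is not visible in this excerpt
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true           # matches the censored skip result shown below
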
32134 1727204437.72167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204437.72187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204437.72229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204437.72241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204437.72292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204437.72312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204437.72338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204437.72373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204437.72386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204437.72523: variable '__network_required_facts' from source: role '' defaults 32134 1727204437.72533: variable 'ansible_facts' from source: unknown 32134 1727204437.72616: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32134 1727204437.72620: when evaluation is False, skipping this task 32134 1727204437.72625: _execute() done 32134 1727204437.72629: dumping result to json 32134 1727204437.72634: done dumping result, returning 32134 1727204437.72641: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-753f-5162-00000000038e] 32134 1727204437.72646: sending task result for task 12b410aa-8751-753f-5162-00000000038e skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204437.72792: no more pending results, returning what we have 32134 1727204437.72797: results queue empty 32134 1727204437.72798: checking for any_errors_fatal 32134 1727204437.72800: done checking for any_errors_fatal 32134 1727204437.72801: checking for max_fail_percentage 32134 1727204437.72802: done checking for max_fail_percentage 32134 1727204437.72803: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.72804: done checking to see if all hosts have failed 32134 1727204437.72805: getting the remaining hosts for 
this loop 32134 1727204437.72806: done getting the remaining hosts for this loop 32134 1727204437.72812: getting the next task for host managed-node2 32134 1727204437.72823: done getting next task for host managed-node2 32134 1727204437.72828: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32134 1727204437.72832: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204437.72847: getting variables 32134 1727204437.72848: in VariableManager get_vars() 32134 1727204437.72888: Calling all_inventory to load vars for managed-node2 32134 1727204437.72893: Calling groups_inventory to load vars for managed-node2 32134 1727204437.72895: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.72905: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.72908: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.72911: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.73098: done sending task result for task 12b410aa-8751-753f-5162-00000000038e 32134 1727204437.73102: WORKER PROCESS EXITING 32134 1727204437.73117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.73311: done with get_vars() 32134 1727204437.73324: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.056) 0:00:12.138 ***** 32134 1727204437.73419: entering _queue_task() for managed-node2/stat 32134 1727204437.73654: worker is 1 (out of 1 available) 32134 1727204437.73667: exiting _queue_task() for managed-node2/stat 32134 1727204437.73682: done queuing things up, now waiting for results queue to drain 32134 1727204437.73684: waiting for pending results... 
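The skip recorded above for "Ensure ansible_facts used by role are present" comes from its conditional: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 is only true when at least one fact the role needs is missing, and here every required fact was already gathered, so the setup module never runs and the host is not contacted. A minimal sketch of a task with that shape, assuming (the log does not quote set_facts.yml itself) that it wraps ansible.builtin.setup:

    # Hypothetical sketch -- not the literal contents of
    # roles/network/tasks/set_facts.yml, only the pattern the log shows.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min          # illustrative argument, not taken from the log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

The "censored" field in the skipping result simply reflects that this task runs with no_log: true, as the message itself states.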
32134 1727204437.74097: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 32134 1727204437.74203: in run() - task 12b410aa-8751-753f-5162-000000000390 32134 1727204437.74234: variable 'ansible_search_path' from source: unknown 32134 1727204437.74294: variable 'ansible_search_path' from source: unknown 32134 1727204437.74298: calling self._execute() 32134 1727204437.74386: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.74403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.74421: variable 'omit' from source: magic vars 32134 1727204437.74836: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.74846: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.75048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204437.75263: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204437.75308: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204437.75335: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204437.75363: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204437.75439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204437.75459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204437.75480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204437.75504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204437.75578: variable '__network_is_ostree' from source: set_fact 32134 1727204437.75585: Evaluated conditional (not __network_is_ostree is defined): False 32134 1727204437.75588: when evaluation is False, skipping this task 32134 1727204437.75594: _execute() done 32134 1727204437.75598: dumping result to json 32134 1727204437.75603: done dumping result, returning 32134 1727204437.75610: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-753f-5162-000000000390] 32134 1727204437.75617: sending task result for task 12b410aa-8751-753f-5162-000000000390 32134 1727204437.75708: done sending task result for task 12b410aa-8751-753f-5162-000000000390 32134 1727204437.75714: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32134 1727204437.75778: no more pending results, returning what we have 32134 1727204437.75782: results queue empty 32134 1727204437.75783: checking for any_errors_fatal 32134 1727204437.75791: done checking for any_errors_fatal 32134 1727204437.75792: checking for 
max_fail_percentage 32134 1727204437.75794: done checking for max_fail_percentage 32134 1727204437.75795: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.75796: done checking to see if all hosts have failed 32134 1727204437.75797: getting the remaining hosts for this loop 32134 1727204437.75798: done getting the remaining hosts for this loop 32134 1727204437.75802: getting the next task for host managed-node2 32134 1727204437.75808: done getting next task for host managed-node2 32134 1727204437.75814: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32134 1727204437.75818: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204437.75834: getting variables 32134 1727204437.75835: in VariableManager get_vars() 32134 1727204437.75869: Calling all_inventory to load vars for managed-node2 32134 1727204437.75872: Calling groups_inventory to load vars for managed-node2 32134 1727204437.75876: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.75883: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.75885: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.75887: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.76072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.76263: done with get_vars() 32134 1727204437.76271: done getting variables 32134 1727204437.76324: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.029) 0:00:12.167 ***** 32134 1727204437.76352: entering _queue_task() for managed-node2/set_fact 32134 1727204437.76545: worker is 1 (out of 1 available) 32134 1727204437.76559: exiting _queue_task() for managed-node2/set_fact 32134 1727204437.76571: done queuing things up, now waiting for results queue to drain 32134 1727204437.76574: waiting for pending results... 
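The ostree check is skipped because __network_is_ostree already exists as a fact from an earlier pass through set_facts.yml, so its guard not __network_is_ostree is defined evaluates to False. A hedged sketch of what a guarded probe of this kind can look like, assuming (not shown in this log) that the flag would be derived from a stat call:

    # Hypothetical sketch of a stat-based ostree probe behind the
    # "only if the flag is not already set" guard seen in the log.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted     # assumed path, not quoted in the log
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

The next task queued above, "Set flag to indicate system is ostree", carries the same guard and is therefore skipped for the same reason, as the output that follows shows.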
32134 1727204437.76733: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32134 1727204437.76844: in run() - task 12b410aa-8751-753f-5162-000000000391 32134 1727204437.76855: variable 'ansible_search_path' from source: unknown 32134 1727204437.76858: variable 'ansible_search_path' from source: unknown 32134 1727204437.76888: calling self._execute() 32134 1727204437.76958: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.76964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.76974: variable 'omit' from source: magic vars 32134 1727204437.77285: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.77297: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.77438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204437.77658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204437.77700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204437.77748: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204437.77776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204437.77898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204437.77995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204437.77999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204437.78004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204437.78195: variable '__network_is_ostree' from source: set_fact 32134 1727204437.78198: Evaluated conditional (not __network_is_ostree is defined): False 32134 1727204437.78200: when evaluation is False, skipping this task 32134 1727204437.78202: _execute() done 32134 1727204437.78204: dumping result to json 32134 1727204437.78206: done dumping result, returning 32134 1727204437.78209: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-753f-5162-000000000391] 32134 1727204437.78211: sending task result for task 12b410aa-8751-753f-5162-000000000391 32134 1727204437.78274: done sending task result for task 12b410aa-8751-753f-5162-000000000391 32134 1727204437.78277: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32134 1727204437.78332: no more pending results, returning what we have 32134 1727204437.78336: results queue empty 32134 1727204437.78337: checking for any_errors_fatal 32134 1727204437.78341: done checking for any_errors_fatal 32134 
1727204437.78342: checking for max_fail_percentage 32134 1727204437.78344: done checking for max_fail_percentage 32134 1727204437.78345: checking to see if all hosts have failed and the running result is not ok 32134 1727204437.78347: done checking to see if all hosts have failed 32134 1727204437.78347: getting the remaining hosts for this loop 32134 1727204437.78349: done getting the remaining hosts for this loop 32134 1727204437.78353: getting the next task for host managed-node2 32134 1727204437.78361: done getting next task for host managed-node2 32134 1727204437.78365: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 32134 1727204437.78370: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204437.78384: getting variables 32134 1727204437.78386: in VariableManager get_vars() 32134 1727204437.78436: Calling all_inventory to load vars for managed-node2 32134 1727204437.78439: Calling groups_inventory to load vars for managed-node2 32134 1727204437.78442: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204437.78451: Calling all_plugins_play to load vars for managed-node2 32134 1727204437.78454: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204437.78458: Calling groups_plugins_play to load vars for managed-node2 32134 1727204437.78742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204437.79109: done with get_vars() 32134 1727204437.79122: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:37 -0400 (0:00:00.028) 0:00:12.196 ***** 32134 1727204437.79244: entering _queue_task() for managed-node2/service_facts 32134 1727204437.79246: Creating lock for service_facts 32134 1727204437.79637: worker is 1 (out of 1 available) 32134 1727204437.79651: exiting _queue_task() for managed-node2/service_facts 32134 1727204437.79667: done queuing things up, now waiting for results queue to drain 32134 1727204437.79670: waiting for pending results... 
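With both ostree tasks skipped, the role moves on to "Check which services are running", the first task in this block that actually contacts the host. The log names only the module (service_facts) and the task path (set_facts.yml:21); a minimal sketch of such a task, with an assumed follow-up consumer to show how the returned structure is normally read:

    # Hypothetical sketch; set_facts.yml:21 itself is not quoted in this log.
    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Example (hypothetical) consumer of the gathered facts
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"

service_facts returns ansible_facts.services, a dictionary keyed by unit name whose entries carry name, state, status, and source; that is exactly the JSON visible in the stdout chunks below, where, for example, NetworkManager.service is reported as running and enabled.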
32134 1727204437.79849: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 32134 1727204437.79959: in run() - task 12b410aa-8751-753f-5162-000000000393 32134 1727204437.79972: variable 'ansible_search_path' from source: unknown 32134 1727204437.79976: variable 'ansible_search_path' from source: unknown 32134 1727204437.80012: calling self._execute() 32134 1727204437.80091: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.80097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.80107: variable 'omit' from source: magic vars 32134 1727204437.80467: variable 'ansible_distribution_major_version' from source: facts 32134 1727204437.80477: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204437.80484: variable 'omit' from source: magic vars 32134 1727204437.80550: variable 'omit' from source: magic vars 32134 1727204437.80581: variable 'omit' from source: magic vars 32134 1727204437.80615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204437.80647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204437.80666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204437.80685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204437.80698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204437.80727: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204437.80731: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.80736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.80825: Set connection var ansible_timeout to 10 32134 1727204437.80837: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204437.80840: Set connection var ansible_connection to ssh 32134 1727204437.80843: Set connection var ansible_shell_type to sh 32134 1727204437.80850: Set connection var ansible_shell_executable to /bin/sh 32134 1727204437.80856: Set connection var ansible_pipelining to False 32134 1727204437.80875: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.80880: variable 'ansible_connection' from source: unknown 32134 1727204437.80883: variable 'ansible_module_compression' from source: unknown 32134 1727204437.80886: variable 'ansible_shell_type' from source: unknown 32134 1727204437.80889: variable 'ansible_shell_executable' from source: unknown 32134 1727204437.80897: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204437.80902: variable 'ansible_pipelining' from source: unknown 32134 1727204437.80905: variable 'ansible_timeout' from source: unknown 32134 1727204437.80907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204437.81076: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204437.81086: variable 'omit' from source: magic vars 32134 
1727204437.81093: starting attempt loop 32134 1727204437.81097: running the handler 32134 1727204437.81115: _low_level_execute_command(): starting 32134 1727204437.81122: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204437.81671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.81675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.81678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.81680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.81736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204437.81743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.81787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.83565: stdout chunk (state=3): >>>/root <<< 32134 1727204437.83694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.83778: stderr chunk (state=3): >>><<< 32134 1727204437.83781: stdout chunk (state=3): >>><<< 32134 1727204437.83815: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204437.83833: _low_level_execute_command(): starting 32134 1727204437.83849: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519 `" && echo ansible-tmp-1727204437.8381658-32996-106537528914519="` 
echo /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519 `" ) && sleep 0' 32134 1727204437.84542: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204437.84546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.84561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204437.84575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204437.84593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204437.84598: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204437.84608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.84657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.84740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204437.84779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.84825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204437.86865: stdout chunk (state=3): >>>ansible-tmp-1727204437.8381658-32996-106537528914519=/root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519 <<< 32134 1727204437.86981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204437.87033: stderr chunk (state=3): >>><<< 32134 1727204437.87036: stdout chunk (state=3): >>><<< 32134 1727204437.87052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204437.8381658-32996-106537528914519=/root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204437.87093: variable 
'ansible_module_compression' from source: unknown 32134 1727204437.87139: ANSIBALLZ: Using lock for service_facts 32134 1727204437.87143: ANSIBALLZ: Acquiring lock 32134 1727204437.87147: ANSIBALLZ: Lock acquired: 140589347851824 32134 1727204437.87149: ANSIBALLZ: Creating module 32134 1727204437.98503: ANSIBALLZ: Writing module into payload 32134 1727204437.98586: ANSIBALLZ: Writing module 32134 1727204437.98607: ANSIBALLZ: Renaming module 32134 1727204437.98617: ANSIBALLZ: Done creating module 32134 1727204437.98631: variable 'ansible_facts' from source: unknown 32134 1727204437.98677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py 32134 1727204437.98796: Sending initial data 32134 1727204437.98799: Sent initial data (162 bytes) 32134 1727204437.99305: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204437.99309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204437.99312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204437.99314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204437.99317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204437.99363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204437.99382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204437.99433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204438.01186: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204438.01220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204438.01260: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpanyjv760 /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py <<< 32134 1727204438.01262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py" <<< 32134 1727204438.01295: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpanyjv760" to remote "/root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py" <<< 32134 1727204438.02100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204438.02174: stderr chunk (state=3): >>><<< 32134 1727204438.02177: stdout chunk (state=3): >>><<< 32134 1727204438.02199: done transferring module to remote 32134 1727204438.02210: _low_level_execute_command(): starting 32134 1727204438.02218: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/ /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py && sleep 0' 32134 1727204438.02672: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204438.02696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204438.02700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204438.02702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204438.02710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204438.02724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204438.02785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204438.02788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204438.02831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204438.04722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204438.04770: stderr chunk (state=3): >>><<< 32134 1727204438.04773: stdout chunk (state=3): >>><<< 32134 1727204438.04796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204438.04801: _low_level_execute_command(): starting 32134 1727204438.04806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/AnsiballZ_service_facts.py && sleep 0' 32134 1727204438.05271: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204438.05275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204438.05277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204438.05280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204438.05282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204438.05343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204438.05346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204438.05387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204440.14396: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "seria<<< 32134 1727204440.15249: stdout chunk (state=3): >>>l-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32134 1727204440.16132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204440.16203: stderr chunk (state=3): >>><<< 32134 1727204440.16239: stdout chunk (state=3): >>><<< 32134 1727204440.16599: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": 
"plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204440.17959: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204440.17981: _low_level_execute_command(): starting 32134 1727204440.17997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204437.8381658-32996-106537528914519/ > /dev/null 2>&1 && sleep 0' 32134 1727204440.18601: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204440.18608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204440.18641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204440.18644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.18647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.18649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.18703: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204440.18710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204440.18722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204440.18778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204440.35943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204440.36003: stderr chunk (state=3): >>><<< 32134 1727204440.36007: stdout chunk (state=3): >>><<< 32134 1727204440.36022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204440.36029: handler run complete 32134 1727204440.36202: variable 'ansible_facts' from source: unknown 32134 1727204440.36344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204440.36765: variable 'ansible_facts' from source: unknown 32134 1727204440.36894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204440.37092: attempt loop complete, returning result 32134 1727204440.37098: _execute() done 32134 1727204440.37101: dumping result to json 32134 1727204440.37150: done dumping result, returning 32134 1727204440.37161: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-753f-5162-000000000393] 32134 1727204440.37164: sending task result for task 12b410aa-8751-753f-5162-000000000393 32134 1727204440.38063: done sending task result for task 12b410aa-8751-753f-5162-000000000393 32134 1727204440.38066: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204440.38110: no more pending results, returning what we have 32134 1727204440.38113: results queue empty 32134 1727204440.38114: checking for any_errors_fatal 32134 1727204440.38116: done checking for any_errors_fatal 32134 1727204440.38117: checking for max_fail_percentage 32134 1727204440.38118: done checking for max_fail_percentage 32134 1727204440.38118: checking to see if all hosts have failed and the running result is not ok 32134 1727204440.38119: done checking to see if all hosts have failed 32134 1727204440.38120: 
getting the remaining hosts for this loop 32134 1727204440.38121: done getting the remaining hosts for this loop 32134 1727204440.38123: getting the next task for host managed-node2 32134 1727204440.38127: done getting next task for host managed-node2 32134 1727204440.38129: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 32134 1727204440.38132: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204440.38139: getting variables 32134 1727204440.38140: in VariableManager get_vars() 32134 1727204440.38163: Calling all_inventory to load vars for managed-node2 32134 1727204440.38165: Calling groups_inventory to load vars for managed-node2 32134 1727204440.38166: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204440.38174: Calling all_plugins_play to load vars for managed-node2 32134 1727204440.38176: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204440.38178: Calling groups_plugins_play to load vars for managed-node2 32134 1727204440.38500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204440.38939: done with get_vars() 32134 1727204440.38951: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:00:40 -0400 (0:00:02.597) 0:00:14.794 ***** 32134 1727204440.39035: entering _queue_task() for managed-node2/package_facts 32134 1727204440.39036: Creating lock for package_facts 32134 1727204440.39258: worker is 1 (out of 1 available) 32134 1727204440.39276: exiting _queue_task() for managed-node2/package_facts 32134 1727204440.39288: done queuing things up, now waiting for results queue to drain 32134 1727204440.39292: waiting for pending results... 
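For reference, the facts gathered above come back as ansible_facts.services, a dict keyed by unit name with name/state/status/source fields, and the task queued next runs the package_facts module from set_facts.yml. The sketch below is illustrative only: the actual set_facts.yml content is not reproduced in this log; the module names (service_facts, package_facts), the no_log behaviour, and the services fact structure are taken from the output above, while the task names, the no_log flag on package_facts, and the debug consumer are assumptions.

    # Illustrative sketch -- not the actual fedora.linux_system_roles.network tasks.
    # service_facts / package_facts and no_log on the services task are taken from the
    # log above; everything else here is an assumption for illustration.
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true            # matches the "censored" task result seen above

    - name: Check which packages are installed
      ansible.builtin.package_facts:
      no_log: true            # assumed; the log has not yet shown this task's args

    - name: Example consumer of the gathered services fact (hypothetical)
      ansible.builtin.debug:
        msg: "firewalld is {{ ansible_facts.services['firewalld.service'].state }}"
      when: "'firewalld.service' in ansible_facts.services"
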
32134 1727204440.39492: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 32134 1727204440.39605: in run() - task 12b410aa-8751-753f-5162-000000000394 32134 1727204440.39622: variable 'ansible_search_path' from source: unknown 32134 1727204440.39626: variable 'ansible_search_path' from source: unknown 32134 1727204440.39659: calling self._execute() 32134 1727204440.39742: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204440.39749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204440.39760: variable 'omit' from source: magic vars 32134 1727204440.40086: variable 'ansible_distribution_major_version' from source: facts 32134 1727204440.40093: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204440.40101: variable 'omit' from source: magic vars 32134 1727204440.40162: variable 'omit' from source: magic vars 32134 1727204440.40198: variable 'omit' from source: magic vars 32134 1727204440.40233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204440.40263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204440.40285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204440.40304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204440.40319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204440.40347: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204440.40350: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204440.40355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204440.40443: Set connection var ansible_timeout to 10 32134 1727204440.40454: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204440.40457: Set connection var ansible_connection to ssh 32134 1727204440.40460: Set connection var ansible_shell_type to sh 32134 1727204440.40467: Set connection var ansible_shell_executable to /bin/sh 32134 1727204440.40474: Set connection var ansible_pipelining to False 32134 1727204440.40493: variable 'ansible_shell_executable' from source: unknown 32134 1727204440.40496: variable 'ansible_connection' from source: unknown 32134 1727204440.40501: variable 'ansible_module_compression' from source: unknown 32134 1727204440.40504: variable 'ansible_shell_type' from source: unknown 32134 1727204440.40507: variable 'ansible_shell_executable' from source: unknown 32134 1727204440.40513: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204440.40522: variable 'ansible_pipelining' from source: unknown 32134 1727204440.40525: variable 'ansible_timeout' from source: unknown 32134 1727204440.40528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204440.40695: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204440.40706: variable 'omit' from source: magic vars 32134 
1727204440.40712: starting attempt loop 32134 1727204440.40718: running the handler 32134 1727204440.40731: _low_level_execute_command(): starting 32134 1727204440.40744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204440.41438: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204440.41442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204440.41445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.41496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204440.41499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204440.41502: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204440.41504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.41507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204440.41509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204440.41520: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32134 1727204440.41529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204440.41545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.41555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204440.41563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204440.41571: stderr chunk (state=3): >>>debug2: match found <<< 32134 1727204440.41581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.41662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204440.41675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204440.41700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204440.41762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204440.43557: stdout chunk (state=3): >>>/root <<< 32134 1727204440.43665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204440.43724: stderr chunk (state=3): >>><<< 32134 1727204440.43728: stdout chunk (state=3): >>><<< 32134 1727204440.43749: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204440.43761: _low_level_execute_command(): starting 32134 1727204440.43769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935 `" && echo ansible-tmp-1727204440.4374888-33101-239381404421935="` echo /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935 `" ) && sleep 0' 32134 1727204440.44259: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.44262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204440.44266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.44268: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.44271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.44313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204440.44317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204440.44370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204440.46430: stdout chunk (state=3): >>>ansible-tmp-1727204440.4374888-33101-239381404421935=/root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935 <<< 32134 1727204440.46549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204440.46594: stderr chunk (state=3): >>><<< 32134 1727204440.46598: stdout chunk (state=3): >>><<< 32134 1727204440.46618: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204440.4374888-33101-239381404421935=/root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204440.46661: variable 'ansible_module_compression' from source: unknown 32134 1727204440.46704: ANSIBALLZ: Using lock for package_facts 32134 1727204440.46707: ANSIBALLZ: Acquiring lock 32134 1727204440.46710: ANSIBALLZ: Lock acquired: 140589351585600 32134 1727204440.46717: ANSIBALLZ: Creating module 32134 1727204440.71294: ANSIBALLZ: Writing module into payload 32134 1727204440.71415: ANSIBALLZ: Writing module 32134 1727204440.71441: ANSIBALLZ: Renaming module 32134 1727204440.71447: ANSIBALLZ: Done creating module 32134 1727204440.71475: variable 'ansible_facts' from source: unknown 32134 1727204440.71619: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py 32134 1727204440.71746: Sending initial data 32134 1727204440.71750: Sent initial data (162 bytes) 32134 1727204440.72246: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204440.72250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204440.72254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.72256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.72259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.72317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204440.72320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204440.72376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204440.74127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32134 1727204440.74131: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204440.74160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204440.74209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpq_txne26 /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py <<< 32134 1727204440.74213: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py" <<< 32134 1727204440.74246: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpq_txne26" to remote "/root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py" <<< 32134 1727204440.75936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204440.76007: stderr chunk (state=3): >>><<< 32134 1727204440.76011: stdout chunk (state=3): >>><<< 32134 1727204440.76032: done transferring module to remote 32134 1727204440.76042: _low_level_execute_command(): starting 32134 1727204440.76052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/ /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py && sleep 0' 32134 1727204440.76541: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204440.76548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204440.76551: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.76553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204440.76555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204440.76558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.76606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204440.76610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204440.76655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204440.78600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204440.78651: stderr chunk (state=3): >>><<< 32134 1727204440.78655: stdout chunk (state=3): 
>>><<< 32134 1727204440.78671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204440.78674: _low_level_execute_command(): starting 32134 1727204440.78683: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/AnsiballZ_package_facts.py && sleep 0' 32134 1727204440.79133: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204440.79137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204440.79140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204440.79142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204440.79196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204440.79208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204440.79251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204441.43772: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 32134 1727204441.43793: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 32134 1727204441.43825: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", 
"release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 32134 1727204441.43848: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": 
"langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 32134 1727204441.43864: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 32134 1727204441.43900: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 32134 1727204441.43914: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 32134 1727204441.43939: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": 
"1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", 
"release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 32134 1727204441.43958: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 32134 1727204441.43975: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 32134 1727204441.43995: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 32134 1727204441.44009: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 32134 1727204441.44024: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 32134 1727204441.44048: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": 
[{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32134 1727204441.45914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204441.45987: stderr chunk (state=3): >>><<< 32134 1727204441.45992: stdout chunk (state=3): >>><<< 32134 1727204441.46036: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204441.48717: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204441.48739: _low_level_execute_command(): starting 32134 1727204441.48745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204440.4374888-33101-239381404421935/ > /dev/null 2>&1 && sleep 0' 32134 1727204441.49256: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204441.49259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204441.49262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204441.49265: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204441.49267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204441.49321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204441.49325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204441.49329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204441.49379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204441.51467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204441.51471: stdout chunk (state=3): >>><<< 32134 1727204441.51473: stderr chunk (state=3): >>><<< 32134 1727204441.51476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204441.51495: handler run complete 32134 1727204441.53120: variable 'ansible_facts' from source: unknown 32134 1727204441.54030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204441.62263: variable 'ansible_facts' from source: unknown 32134 1727204441.63063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204441.64543: attempt loop complete, returning result 32134 1727204441.64561: _execute() done 32134 1727204441.64564: dumping result to json 32134 1727204441.64741: done dumping result, returning 32134 1727204441.64750: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-753f-5162-000000000394] 32134 1727204441.64754: sending task result for task 12b410aa-8751-753f-5162-000000000394 32134 1727204441.68023: done sending task result for task 12b410aa-8751-753f-5162-000000000394 32134 1727204441.68027: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204441.68131: no more pending results, returning what we have 32134 1727204441.68134: results queue empty 32134 1727204441.68135: checking for any_errors_fatal 32134 1727204441.68141: done checking for any_errors_fatal 32134 1727204441.68142: checking for max_fail_percentage 32134 1727204441.68144: done checking for max_fail_percentage 32134 1727204441.68145: checking to see if all hosts have failed and the running result is not ok 32134 1727204441.68146: done checking to see if all hosts have failed 32134 1727204441.68147: getting the remaining hosts for this loop 32134 1727204441.68148: done getting the remaining hosts for this loop 32134 1727204441.68153: getting the next task for host managed-node2 32134 1727204441.68160: done getting next task for host managed-node2 32134 1727204441.68164: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32134 1727204441.68167: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204441.68181: getting variables 32134 1727204441.68182: in VariableManager get_vars() 32134 1727204441.68221: Calling all_inventory to load vars for managed-node2 32134 1727204441.68224: Calling groups_inventory to load vars for managed-node2 32134 1727204441.68227: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204441.68238: Calling all_plugins_play to load vars for managed-node2 32134 1727204441.68241: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204441.68245: Calling groups_plugins_play to load vars for managed-node2 32134 1727204441.70359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204441.75074: done with get_vars() 32134 1727204441.75225: done getting variables 32134 1727204441.75302: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:00:41 -0400 (0:00:01.363) 0:00:16.157 ***** 32134 1727204441.75347: entering _queue_task() for managed-node2/debug 32134 1727204441.75705: worker is 1 (out of 1 available) 32134 1727204441.75719: exiting _queue_task() for managed-node2/debug 32134 1727204441.75733: done queuing things up, now waiting for results queue to drain 32134 1727204441.75735: waiting for pending results... 
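For context, the two tasks traced in this part of the log correspond to ordinary role tasks: a package_facts call (module_args manager: auto, strategy: first, executed with no_log so the result above is censored in the task summary) followed by a debug task that prints the selected network provider (task path roles/network/tasks/main.yml:7, reported further below as "Using network provider: nm"). The role's actual YAML is not shown in this trace, so the sketch below is a reconstruction under those assumptions: the task names and module arguments come from the trace, the surrounding structure and the final NetworkManager example are purely illustrative.

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto        # auto-detect the package manager (rpm/dnf here)
    strategy: first      # query only the first manager that matches
  no_log: true           # matches the censored result seen in the trace

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"

# package_facts stores its result in ansible_facts.packages, a dict keyed by
# package name whose values are lists of {name, version, release, epoch, arch,
# source} entries, as in the JSON dump above. Illustrative follow-up only:
- name: Show the installed NetworkManager version
  ansible.builtin.debug:
    msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }}"
  when: "'NetworkManager' in ansible_facts.packages"

With strategy: first, only the first detected package manager is queried, which is why the dump above lists every entry with "source": "rpm".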
32134 1727204441.76122: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 32134 1727204441.76225: in run() - task 12b410aa-8751-753f-5162-000000000018 32134 1727204441.76250: variable 'ansible_search_path' from source: unknown 32134 1727204441.76258: variable 'ansible_search_path' from source: unknown 32134 1727204441.76305: calling self._execute() 32134 1727204441.76408: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204441.76421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204441.76442: variable 'omit' from source: magic vars 32134 1727204441.76878: variable 'ansible_distribution_major_version' from source: facts 32134 1727204441.76894: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204441.76898: variable 'omit' from source: magic vars 32134 1727204441.76957: variable 'omit' from source: magic vars 32134 1727204441.77041: variable 'network_provider' from source: set_fact 32134 1727204441.77057: variable 'omit' from source: magic vars 32134 1727204441.77094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204441.77125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204441.77147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204441.77164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204441.77175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204441.77205: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204441.77208: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204441.77216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204441.77299: Set connection var ansible_timeout to 10 32134 1727204441.77315: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204441.77318: Set connection var ansible_connection to ssh 32134 1727204441.77321: Set connection var ansible_shell_type to sh 32134 1727204441.77327: Set connection var ansible_shell_executable to /bin/sh 32134 1727204441.77334: Set connection var ansible_pipelining to False 32134 1727204441.77353: variable 'ansible_shell_executable' from source: unknown 32134 1727204441.77356: variable 'ansible_connection' from source: unknown 32134 1727204441.77359: variable 'ansible_module_compression' from source: unknown 32134 1727204441.77363: variable 'ansible_shell_type' from source: unknown 32134 1727204441.77368: variable 'ansible_shell_executable' from source: unknown 32134 1727204441.77370: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204441.77376: variable 'ansible_pipelining' from source: unknown 32134 1727204441.77379: variable 'ansible_timeout' from source: unknown 32134 1727204441.77385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204441.77504: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 32134 1727204441.77517: variable 'omit' from source: magic vars 32134 1727204441.77521: starting attempt loop 32134 1727204441.77525: running the handler 32134 1727204441.77566: handler run complete 32134 1727204441.77580: attempt loop complete, returning result 32134 1727204441.77583: _execute() done 32134 1727204441.77586: dumping result to json 32134 1727204441.77594: done dumping result, returning 32134 1727204441.77603: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-753f-5162-000000000018] 32134 1727204441.77606: sending task result for task 12b410aa-8751-753f-5162-000000000018 32134 1727204441.77698: done sending task result for task 12b410aa-8751-753f-5162-000000000018 32134 1727204441.77702: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 32134 1727204441.77766: no more pending results, returning what we have 32134 1727204441.77770: results queue empty 32134 1727204441.77771: checking for any_errors_fatal 32134 1727204441.77781: done checking for any_errors_fatal 32134 1727204441.77783: checking for max_fail_percentage 32134 1727204441.77784: done checking for max_fail_percentage 32134 1727204441.77785: checking to see if all hosts have failed and the running result is not ok 32134 1727204441.77786: done checking to see if all hosts have failed 32134 1727204441.77787: getting the remaining hosts for this loop 32134 1727204441.77791: done getting the remaining hosts for this loop 32134 1727204441.77796: getting the next task for host managed-node2 32134 1727204441.77802: done getting next task for host managed-node2 32134 1727204441.77807: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32134 1727204441.77809: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204441.77824: getting variables 32134 1727204441.77826: in VariableManager get_vars() 32134 1727204441.77862: Calling all_inventory to load vars for managed-node2 32134 1727204441.77865: Calling groups_inventory to load vars for managed-node2 32134 1727204441.77867: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204441.77878: Calling all_plugins_play to load vars for managed-node2 32134 1727204441.77881: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204441.77885: Calling groups_plugins_play to load vars for managed-node2 32134 1727204441.79755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204441.81342: done with get_vars() 32134 1727204441.81369: done getting variables 32134 1727204441.81423: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:00:41 -0400 (0:00:00.061) 0:00:16.218 ***** 32134 1727204441.81452: entering _queue_task() for managed-node2/fail 32134 1727204441.81736: worker is 1 (out of 1 available) 32134 1727204441.81752: exiting _queue_task() for managed-node2/fail 32134 1727204441.81765: done queuing things up, now waiting for results queue to drain 32134 1727204441.81767: waiting for pending results... 
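The task queued above (main.yml:11) is a guard that should only fire when a non-empty network_state is combined with the initscripts provider; in the run that follows it is skipped because the first condition, network_state != {}, evaluates to False. A minimal hedged sketch of such a guard, with the failure message and the exact condition list assumed rather than taken from the role:

# Hedged sketch of a conditional fail guard like the one skipped below.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported by the initscripts provider  # wording assumed
  when:
    - network_state != {}
    - network_provider == "initscripts"

Note that the skip result only reports the first failing condition as false_condition, which is why only network_state != {} appears in the JSON below.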
32134 1727204441.82132: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32134 1727204441.82147: in run() - task 12b410aa-8751-753f-5162-000000000019 32134 1727204441.82164: variable 'ansible_search_path' from source: unknown 32134 1727204441.82168: variable 'ansible_search_path' from source: unknown 32134 1727204441.82211: calling self._execute() 32134 1727204441.82316: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204441.82321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204441.82335: variable 'omit' from source: magic vars 32134 1727204441.82806: variable 'ansible_distribution_major_version' from source: facts 32134 1727204441.82819: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204441.82971: variable 'network_state' from source: role '' defaults 32134 1727204441.82996: Evaluated conditional (network_state != {}): False 32134 1727204441.83000: when evaluation is False, skipping this task 32134 1727204441.83003: _execute() done 32134 1727204441.83006: dumping result to json 32134 1727204441.83009: done dumping result, returning 32134 1727204441.83100: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-753f-5162-000000000019] 32134 1727204441.83108: sending task result for task 12b410aa-8751-753f-5162-000000000019 32134 1727204441.83180: done sending task result for task 12b410aa-8751-753f-5162-000000000019 32134 1727204441.83183: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204441.83385: no more pending results, returning what we have 32134 1727204441.83391: results queue empty 32134 1727204441.83393: checking for any_errors_fatal 32134 1727204441.83398: done checking for any_errors_fatal 32134 1727204441.83399: checking for max_fail_percentage 32134 1727204441.83401: done checking for max_fail_percentage 32134 1727204441.83402: checking to see if all hosts have failed and the running result is not ok 32134 1727204441.83403: done checking to see if all hosts have failed 32134 1727204441.83404: getting the remaining hosts for this loop 32134 1727204441.83406: done getting the remaining hosts for this loop 32134 1727204441.83410: getting the next task for host managed-node2 32134 1727204441.83416: done getting next task for host managed-node2 32134 1727204441.83420: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32134 1727204441.83424: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204441.83439: getting variables 32134 1727204441.83441: in VariableManager get_vars() 32134 1727204441.83478: Calling all_inventory to load vars for managed-node2 32134 1727204441.83482: Calling groups_inventory to load vars for managed-node2 32134 1727204441.83485: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204441.83498: Calling all_plugins_play to load vars for managed-node2 32134 1727204441.83501: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204441.83506: Calling groups_plugins_play to load vars for managed-node2 32134 1727204441.85649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204441.88592: done with get_vars() 32134 1727204441.88635: done getting variables 32134 1727204441.88719: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:00:41 -0400 (0:00:00.073) 0:00:16.291 ***** 32134 1727204441.88760: entering _queue_task() for managed-node2/fail 32134 1727204441.89139: worker is 1 (out of 1 available) 32134 1727204441.89154: exiting _queue_task() for managed-node2/fail 32134 1727204441.89168: done queuing things up, now waiting for results queue to drain 32134 1727204441.89170: waiting for pending results... 
32134 1727204441.89542: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32134 1727204441.89708: in run() - task 12b410aa-8751-753f-5162-00000000001a 32134 1727204441.89712: variable 'ansible_search_path' from source: unknown 32134 1727204441.89715: variable 'ansible_search_path' from source: unknown 32134 1727204441.89717: calling self._execute() 32134 1727204441.89954: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204441.89973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204441.90151: variable 'omit' from source: magic vars 32134 1727204441.90554: variable 'ansible_distribution_major_version' from source: facts 32134 1727204441.90670: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204441.90956: variable 'network_state' from source: role '' defaults 32134 1727204441.90975: Evaluated conditional (network_state != {}): False 32134 1727204441.90983: when evaluation is False, skipping this task 32134 1727204441.90994: _execute() done 32134 1727204441.91004: dumping result to json 32134 1727204441.91014: done dumping result, returning 32134 1727204441.91077: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-753f-5162-00000000001a] 32134 1727204441.91092: sending task result for task 12b410aa-8751-753f-5162-00000000001a skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204441.91295: no more pending results, returning what we have 32134 1727204441.91300: results queue empty 32134 1727204441.91301: checking for any_errors_fatal 32134 1727204441.91310: done checking for any_errors_fatal 32134 1727204441.91311: checking for max_fail_percentage 32134 1727204441.91312: done checking for max_fail_percentage 32134 1727204441.91313: checking to see if all hosts have failed and the running result is not ok 32134 1727204441.91315: done checking to see if all hosts have failed 32134 1727204441.91316: getting the remaining hosts for this loop 32134 1727204441.91318: done getting the remaining hosts for this loop 32134 1727204441.91323: getting the next task for host managed-node2 32134 1727204441.91331: done getting next task for host managed-node2 32134 1727204441.91335: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32134 1727204441.91340: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204441.91359: getting variables 32134 1727204441.91361: in VariableManager get_vars() 32134 1727204441.91512: Calling all_inventory to load vars for managed-node2 32134 1727204441.91516: Calling groups_inventory to load vars for managed-node2 32134 1727204441.91520: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204441.91527: done sending task result for task 12b410aa-8751-753f-5162-00000000001a 32134 1727204441.91531: WORKER PROCESS EXITING 32134 1727204441.91547: Calling all_plugins_play to load vars for managed-node2 32134 1727204441.91551: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204441.91556: Calling groups_plugins_play to load vars for managed-node2 32134 1727204441.95178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204441.98663: done with get_vars() 32134 1727204441.98711: done getting variables 32134 1727204441.98787: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:00:41 -0400 (0:00:00.100) 0:00:16.392 ***** 32134 1727204441.98830: entering _queue_task() for managed-node2/fail 32134 1727204441.99180: worker is 1 (out of 1 available) 32134 1727204441.99195: exiting _queue_task() for managed-node2/fail 32134 1727204441.99208: done queuing things up, now waiting for results queue to drain 32134 1727204441.99210: waiting for pending results... 
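The teaming guard queued above (main.yml:25) is skipped in the trace that follows: the major version check (ansible_distribution_major_version | int > 9) passes, but the distribution is not in the role's __network_rh_distros list. A hedged sketch of such a guard, with variable names taken from the trace and everything else assumed:

# Hedged sketch; message text and the omitted team-interface condition are assumptions.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later  # wording assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    # a further condition checking that team connections are actually requested is omitted here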
32134 1727204441.99610: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32134 1727204441.99669: in run() - task 12b410aa-8751-753f-5162-00000000001b 32134 1727204441.99696: variable 'ansible_search_path' from source: unknown 32134 1727204441.99710: variable 'ansible_search_path' from source: unknown 32134 1727204441.99756: calling self._execute() 32134 1727204441.99974: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204441.99988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.00006: variable 'omit' from source: magic vars 32134 1727204442.00631: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.00653: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.00887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.04261: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.04367: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.04434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.04470: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.04543: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.04614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.04662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.04702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.04765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.04870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.04916: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.04940: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32134 1727204442.05100: variable 'ansible_distribution' from source: facts 32134 1727204442.05110: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.05126: Evaluated conditional (ansible_distribution in __network_rh_distros): False 32134 1727204442.05133: when evaluation is False, skipping this task 32134 1727204442.05141: _execute() done 32134 1727204442.05149: dumping result to json 32134 1727204442.05158: done dumping result, returning 32134 1727204442.05170: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-753f-5162-00000000001b] 32134 1727204442.05178: sending task result for task 12b410aa-8751-753f-5162-00000000001b skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 32134 1727204442.05356: no more pending results, returning what we have 32134 1727204442.05362: results queue empty 32134 1727204442.05363: checking for any_errors_fatal 32134 1727204442.05370: done checking for any_errors_fatal 32134 1727204442.05371: checking for max_fail_percentage 32134 1727204442.05373: done checking for max_fail_percentage 32134 1727204442.05374: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.05375: done checking to see if all hosts have failed 32134 1727204442.05376: getting the remaining hosts for this loop 32134 1727204442.05377: done getting the remaining hosts for this loop 32134 1727204442.05382: getting the next task for host managed-node2 32134 1727204442.05393: done getting next task for host managed-node2 32134 1727204442.05398: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32134 1727204442.05401: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204442.05419: getting variables 32134 1727204442.05422: in VariableManager get_vars() 32134 1727204442.05467: Calling all_inventory to load vars for managed-node2 32134 1727204442.05471: Calling groups_inventory to load vars for managed-node2 32134 1727204442.05474: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.05487: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.05597: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.05603: done sending task result for task 12b410aa-8751-753f-5162-00000000001b 32134 1727204442.05607: WORKER PROCESS EXITING 32134 1727204442.05611: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.08148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.13176: done with get_vars() 32134 1727204442.13225: done getting variables 32134 1727204442.13346: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.145) 0:00:16.537 ***** 32134 1727204442.13384: entering _queue_task() for managed-node2/dnf 32134 1727204442.13741: worker is 1 (out of 1 available) 32134 1727204442.13756: exiting _queue_task() for managed-node2/dnf 32134 1727204442.13770: done queuing things up, now waiting for results queue to drain 32134 1727204442.13772: waiting for pending results... 
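The task queued above (main.yml:36) loads the dnf action plugin and, in the run that follows, is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the configured connections. A hedged sketch of what such a check-only package task could look like; the dnf parameters and check_mode usage are assumptions, only the when expression is taken from the trace:

# Hedged sketch of a guarded package-update check; parameters are assumed.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"
    state: latest
    update_only: true
  check_mode: true
  when: __network_wireless_connections_defined or __network_team_connections_defined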
32134 1727204442.14046: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32134 1727204442.14200: in run() - task 12b410aa-8751-753f-5162-00000000001c 32134 1727204442.14220: variable 'ansible_search_path' from source: unknown 32134 1727204442.14224: variable 'ansible_search_path' from source: unknown 32134 1727204442.14273: calling self._execute() 32134 1727204442.14380: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.14434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.14438: variable 'omit' from source: magic vars 32134 1727204442.14857: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.14869: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.15154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.17971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.18195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.18199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.18201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.18204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.18295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.18346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.18377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.18442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.18460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.18611: variable 'ansible_distribution' from source: facts 32134 1727204442.18620: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.18629: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32134 1727204442.18793: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.18991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.19024: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.19081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.19114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.19135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.19191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.19297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.19301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.19320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.19338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.19390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.19431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.19462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.19515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.19542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.19781: variable 'network_connections' from source: task vars 32134 1727204442.19797: variable 'interface' from source: set_fact 32134 1727204442.19951: variable 'interface' from source: set_fact 32134 1727204442.19955: variable 'interface' from source: set_fact 32134 1727204442.19991: variable 'interface' from source: set_fact 32134 1727204442.20091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 32134 1727204442.20310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204442.20357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204442.20420: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204442.20455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204442.20594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204442.20598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204442.20611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.20614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204442.20690: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204442.21048: variable 'network_connections' from source: task vars 32134 1727204442.21059: variable 'interface' from source: set_fact 32134 1727204442.21165: variable 'interface' from source: set_fact 32134 1727204442.21169: variable 'interface' from source: set_fact 32134 1727204442.21224: variable 'interface' from source: set_fact 32134 1727204442.21260: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204442.21272: when evaluation is False, skipping this task 32134 1727204442.21278: _execute() done 32134 1727204442.21281: dumping result to json 32134 1727204442.21380: done dumping result, returning 32134 1727204442.21383: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000001c] 32134 1727204442.21385: sending task result for task 12b410aa-8751-753f-5162-00000000001c 32134 1727204442.21455: done sending task result for task 12b410aa-8751-753f-5162-00000000001c 32134 1727204442.21458: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204442.21516: no more pending results, returning what we have 32134 1727204442.21520: results queue empty 32134 1727204442.21521: checking for any_errors_fatal 32134 1727204442.21529: done checking for any_errors_fatal 32134 1727204442.21530: checking for max_fail_percentage 32134 1727204442.21532: done checking for max_fail_percentage 32134 1727204442.21533: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.21534: done checking to see if all hosts have failed 32134 1727204442.21535: getting the remaining hosts for this loop 32134 1727204442.21537: done getting the remaining hosts for this loop 32134 
1727204442.21541: getting the next task for host managed-node2 32134 1727204442.21549: done getting next task for host managed-node2 32134 1727204442.21554: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32134 1727204442.21557: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204442.21573: getting variables 32134 1727204442.21574: in VariableManager get_vars() 32134 1727204442.21616: Calling all_inventory to load vars for managed-node2 32134 1727204442.21619: Calling groups_inventory to load vars for managed-node2 32134 1727204442.21622: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.21632: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.21635: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.21639: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.26238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.29520: done with get_vars() 32134 1727204442.29771: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32134 1727204442.29867: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.165) 0:00:16.703 ***** 32134 1727204442.29908: entering _queue_task() for managed-node2/yum 32134 1727204442.29910: Creating lock for yum 32134 1727204442.30608: worker is 1 (out of 1 available) 32134 1727204442.30619: exiting _queue_task() for managed-node2/yum 32134 1727204442.30630: done queuing things up, now waiting for results queue to drain 32134 1727204442.30632: waiting for pending results... 
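The YUM-based variant queued above (main.yml:48) is the companion to the DNF check; the trace notes that ansible.builtin.yum is redirected to ansible.builtin.dnf on this system, and the run that follows skips it because ansible_distribution_major_version | int < 8 is False. A hedged sketch under the same assumptions as the DNF example:

# Hedged sketch; on this host the yum action is redirected to dnf, as the trace shows.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined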
32134 1727204442.30909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32134 1727204442.30916: in run() - task 12b410aa-8751-753f-5162-00000000001d 32134 1727204442.30926: variable 'ansible_search_path' from source: unknown 32134 1727204442.30936: variable 'ansible_search_path' from source: unknown 32134 1727204442.31033: calling self._execute() 32134 1727204442.31144: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.31160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.31176: variable 'omit' from source: magic vars 32134 1727204442.31952: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.31984: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.32452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.36689: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.36808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.36851: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.42079: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.42106: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.42169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.42193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.42223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.42256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.42269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.42354: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.42366: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32134 1727204442.42370: when evaluation is False, skipping this task 32134 1727204442.42373: _execute() done 32134 1727204442.42376: dumping result to json 32134 1727204442.42381: done dumping result, returning 32134 1727204442.42388: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000001d] 32134 
1727204442.42394: sending task result for task 12b410aa-8751-753f-5162-00000000001d 32134 1727204442.42492: done sending task result for task 12b410aa-8751-753f-5162-00000000001d 32134 1727204442.42496: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32134 1727204442.42550: no more pending results, returning what we have 32134 1727204442.42554: results queue empty 32134 1727204442.42555: checking for any_errors_fatal 32134 1727204442.42562: done checking for any_errors_fatal 32134 1727204442.42562: checking for max_fail_percentage 32134 1727204442.42564: done checking for max_fail_percentage 32134 1727204442.42565: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.42566: done checking to see if all hosts have failed 32134 1727204442.42567: getting the remaining hosts for this loop 32134 1727204442.42568: done getting the remaining hosts for this loop 32134 1727204442.42572: getting the next task for host managed-node2 32134 1727204442.42579: done getting next task for host managed-node2 32134 1727204442.42583: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32134 1727204442.42587: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204442.42604: getting variables 32134 1727204442.42612: in VariableManager get_vars() 32134 1727204442.42654: Calling all_inventory to load vars for managed-node2 32134 1727204442.42657: Calling groups_inventory to load vars for managed-node2 32134 1727204442.42660: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.42670: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.42673: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.42677: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.47120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.48677: done with get_vars() 32134 1727204442.48702: done getting variables 32134 1727204442.48746: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.188) 0:00:16.891 ***** 32134 1727204442.48769: entering _queue_task() for managed-node2/fail 32134 1727204442.49044: worker is 1 (out of 1 available) 32134 1727204442.49061: exiting _queue_task() for managed-node2/fail 32134 1727204442.49074: done queuing things up, now waiting for results queue to drain 32134 1727204442.49076: waiting for pending results... 
32134 1727204442.49269: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32134 1727204442.49379: in run() - task 12b410aa-8751-753f-5162-00000000001e 32134 1727204442.49393: variable 'ansible_search_path' from source: unknown 32134 1727204442.49397: variable 'ansible_search_path' from source: unknown 32134 1727204442.49435: calling self._execute() 32134 1727204442.49514: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.49525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.49537: variable 'omit' from source: magic vars 32134 1727204442.49862: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.49875: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.49983: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.50155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.51927: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.51993: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.52030: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.52061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.52084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.52155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.52182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.52206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.52241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.52254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.52299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.52321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.52341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.52376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.52392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.52429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.52449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.52470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.52505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.52521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.52663: variable 'network_connections' from source: task vars 32134 1727204442.52674: variable 'interface' from source: set_fact 32134 1727204442.52744: variable 'interface' from source: set_fact 32134 1727204442.52752: variable 'interface' from source: set_fact 32134 1727204442.52805: variable 'interface' from source: set_fact 32134 1727204442.52868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204442.53013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204442.53049: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204442.53077: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204442.53103: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204442.53144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204442.53166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204442.53191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.53212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204442.53267: 
variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204442.53469: variable 'network_connections' from source: task vars 32134 1727204442.53479: variable 'interface' from source: set_fact 32134 1727204442.53531: variable 'interface' from source: set_fact 32134 1727204442.53538: variable 'interface' from source: set_fact 32134 1727204442.53590: variable 'interface' from source: set_fact 32134 1727204442.53621: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204442.53624: when evaluation is False, skipping this task 32134 1727204442.53627: _execute() done 32134 1727204442.53632: dumping result to json 32134 1727204442.53636: done dumping result, returning 32134 1727204442.53645: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000001e] 32134 1727204442.53658: sending task result for task 12b410aa-8751-753f-5162-00000000001e 32134 1727204442.53746: done sending task result for task 12b410aa-8751-753f-5162-00000000001e 32134 1727204442.53748: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204442.53831: no more pending results, returning what we have 32134 1727204442.53835: results queue empty 32134 1727204442.53836: checking for any_errors_fatal 32134 1727204442.53844: done checking for any_errors_fatal 32134 1727204442.53845: checking for max_fail_percentage 32134 1727204442.53847: done checking for max_fail_percentage 32134 1727204442.53848: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.53849: done checking to see if all hosts have failed 32134 1727204442.53850: getting the remaining hosts for this loop 32134 1727204442.53851: done getting the remaining hosts for this loop 32134 1727204442.53855: getting the next task for host managed-node2 32134 1727204442.53870: done getting next task for host managed-node2 32134 1727204442.53875: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32134 1727204442.53878: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204442.53896: getting variables 32134 1727204442.53898: in VariableManager get_vars() 32134 1727204442.53936: Calling all_inventory to load vars for managed-node2 32134 1727204442.53939: Calling groups_inventory to load vars for managed-node2 32134 1727204442.53941: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.53952: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.53955: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.53958: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.55201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.56910: done with get_vars() 32134 1727204442.56935: done getting variables 32134 1727204442.56986: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.082) 0:00:16.974 ***** 32134 1727204442.57022: entering _queue_task() for managed-node2/package 32134 1727204442.57296: worker is 1 (out of 1 available) 32134 1727204442.57311: exiting _queue_task() for managed-node2/package 32134 1727204442.57325: done queuing things up, now waiting for results queue to drain 32134 1727204442.57327: waiting for pending results... 32134 1727204442.57512: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 32134 1727204442.57616: in run() - task 12b410aa-8751-753f-5162-00000000001f 32134 1727204442.57631: variable 'ansible_search_path' from source: unknown 32134 1727204442.57635: variable 'ansible_search_path' from source: unknown 32134 1727204442.57671: calling self._execute() 32134 1727204442.57753: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.57760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.57774: variable 'omit' from source: magic vars 32134 1727204442.58099: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.58113: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.58281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204442.58510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204442.58553: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204442.58608: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204442.58641: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204442.58745: variable 'network_packages' from source: role '' defaults 32134 1727204442.58834: variable '__network_provider_setup' from source: role '' defaults 32134 1727204442.58845: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204442.58912: variable 
'__network_service_name_default_nm' from source: role '' defaults 32134 1727204442.58922: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204442.58973: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204442.59135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.60715: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.60770: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.60802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.60837: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.60859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.60944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.60966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.60987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.61025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.61039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.61080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.61102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.61125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.61162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.61175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.61359: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32134 1727204442.61454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.61475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.61501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.61535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.61548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.61628: variable 'ansible_python' from source: facts 32134 1727204442.61650: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32134 1727204442.61723: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204442.61788: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204442.61902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.61927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.61949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.61980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.61993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.62039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.62063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.62083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.62118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.62132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.62259: variable 'network_connections' from source: task vars 32134 1727204442.62266: variable 'interface' from source: set_fact 32134 1727204442.62356: variable 'interface' from source: set_fact 32134 1727204442.62365: variable 'interface' from source: set_fact 32134 1727204442.62451: variable 'interface' from source: set_fact 32134 1727204442.62512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204442.62537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204442.62563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.62594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204442.62637: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.62878: variable 'network_connections' from source: task vars 32134 1727204442.62881: variable 'interface' from source: set_fact 32134 1727204442.62970: variable 'interface' from source: set_fact 32134 1727204442.62978: variable 'interface' from source: set_fact 32134 1727204442.63065: variable 'interface' from source: set_fact 32134 1727204442.63112: variable '__network_packages_default_wireless' from source: role '' defaults 32134 1727204442.63183: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.63446: variable 'network_connections' from source: task vars 32134 1727204442.63451: variable 'interface' from source: set_fact 32134 1727204442.63506: variable 'interface' from source: set_fact 32134 1727204442.63512: variable 'interface' from source: set_fact 32134 1727204442.63570: variable 'interface' from source: set_fact 32134 1727204442.63592: variable '__network_packages_default_team' from source: role '' defaults 32134 1727204442.63659: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204442.63919: variable 'network_connections' from source: task vars 32134 1727204442.63923: variable 'interface' from source: set_fact 32134 1727204442.63978: variable 'interface' from source: set_fact 32134 1727204442.63985: variable 'interface' from source: set_fact 32134 1727204442.64046: variable 'interface' from source: set_fact 32134 1727204442.64097: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204442.64152: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204442.64159: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204442.64211: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204442.64401: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32134 1727204442.64826: variable 'network_connections' from source: task vars 32134 1727204442.64830: variable 'interface' from source: set_fact 32134 
1727204442.64886: variable 'interface' from source: set_fact 32134 1727204442.64891: variable 'interface' from source: set_fact 32134 1727204442.64943: variable 'interface' from source: set_fact 32134 1727204442.64952: variable 'ansible_distribution' from source: facts 32134 1727204442.64955: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.64963: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.64983: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32134 1727204442.65129: variable 'ansible_distribution' from source: facts 32134 1727204442.65132: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.65139: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.65146: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32134 1727204442.65284: variable 'ansible_distribution' from source: facts 32134 1727204442.65288: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.65295: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.65331: variable 'network_provider' from source: set_fact 32134 1727204442.65344: variable 'ansible_facts' from source: unknown 32134 1727204442.66027: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32134 1727204442.66031: when evaluation is False, skipping this task 32134 1727204442.66034: _execute() done 32134 1727204442.66037: dumping result to json 32134 1727204442.66042: done dumping result, returning 32134 1727204442.66051: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-753f-5162-00000000001f] 32134 1727204442.66056: sending task result for task 12b410aa-8751-753f-5162-00000000001f 32134 1727204442.66156: done sending task result for task 12b410aa-8751-753f-5162-00000000001f 32134 1727204442.66159: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32134 1727204442.66225: no more pending results, returning what we have 32134 1727204442.66229: results queue empty 32134 1727204442.66230: checking for any_errors_fatal 32134 1727204442.66240: done checking for any_errors_fatal 32134 1727204442.66241: checking for max_fail_percentage 32134 1727204442.66242: done checking for max_fail_percentage 32134 1727204442.66243: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.66244: done checking to see if all hosts have failed 32134 1727204442.66245: getting the remaining hosts for this loop 32134 1727204442.66247: done getting the remaining hosts for this loop 32134 1727204442.66251: getting the next task for host managed-node2 32134 1727204442.66259: done getting next task for host managed-node2 32134 1727204442.66264: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32134 1727204442.66266: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204442.66287: getting variables 32134 1727204442.66291: in VariableManager get_vars() 32134 1727204442.66334: Calling all_inventory to load vars for managed-node2 32134 1727204442.66337: Calling groups_inventory to load vars for managed-node2 32134 1727204442.66340: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.66351: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.66354: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.66357: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.67675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.69269: done with get_vars() 32134 1727204442.69297: done getting variables 32134 1727204442.69352: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.123) 0:00:17.097 ***** 32134 1727204442.69379: entering _queue_task() for managed-node2/package 32134 1727204442.69643: worker is 1 (out of 1 available) 32134 1727204442.69657: exiting _queue_task() for managed-node2/package 32134 1727204442.69671: done queuing things up, now waiting for results queue to drain 32134 1727204442.69673: waiting for pending results... 
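The "Install packages" task above is skipped because its condition, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package the role wants is already listed in the gathered package facts, so there is nothing for the package manager to do. A minimal stand-alone sketch of the same pattern (the play name, host and package list are illustrative, not taken from this run):

- name: Install only when something is missing (illustrative sketch)
  hosts: managed-node2
  vars:
    wanted_packages:            # illustrative list, not the role's network_packages
      - NetworkManager
  tasks:
    - name: Gather installed-package facts for the subset test
      ansible.builtin.package_facts:
        manager: auto

    - name: Install the packages only if the wanted set is not already installed
      ansible.builtin.package:
        name: "{{ wanted_packages }}"
        state: present
      when: not wanted_packages is subset(ansible_facts.packages.keys())

Gating on package_facts this way avoids invoking the package manager at all when nothing is missing, which is why the result above is a plain skip rather than an ok/changed task.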
32134 1727204442.69858: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32134 1727204442.69961: in run() - task 12b410aa-8751-753f-5162-000000000020 32134 1727204442.69975: variable 'ansible_search_path' from source: unknown 32134 1727204442.69979: variable 'ansible_search_path' from source: unknown 32134 1727204442.70017: calling self._execute() 32134 1727204442.70094: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.70100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.70114: variable 'omit' from source: magic vars 32134 1727204442.70432: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.70444: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.70549: variable 'network_state' from source: role '' defaults 32134 1727204442.70559: Evaluated conditional (network_state != {}): False 32134 1727204442.70566: when evaluation is False, skipping this task 32134 1727204442.70573: _execute() done 32134 1727204442.70576: dumping result to json 32134 1727204442.70582: done dumping result, returning 32134 1727204442.70592: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-753f-5162-000000000020] 32134 1727204442.70598: sending task result for task 12b410aa-8751-753f-5162-000000000020 32134 1727204442.70701: done sending task result for task 12b410aa-8751-753f-5162-000000000020 32134 1727204442.70704: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204442.70763: no more pending results, returning what we have 32134 1727204442.70768: results queue empty 32134 1727204442.70769: checking for any_errors_fatal 32134 1727204442.70778: done checking for any_errors_fatal 32134 1727204442.70779: checking for max_fail_percentage 32134 1727204442.70780: done checking for max_fail_percentage 32134 1727204442.70782: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.70783: done checking to see if all hosts have failed 32134 1727204442.70784: getting the remaining hosts for this loop 32134 1727204442.70785: done getting the remaining hosts for this loop 32134 1727204442.70791: getting the next task for host managed-node2 32134 1727204442.70798: done getting next task for host managed-node2 32134 1727204442.70802: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32134 1727204442.70806: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204442.70829: getting variables 32134 1727204442.70830: in VariableManager get_vars() 32134 1727204442.70866: Calling all_inventory to load vars for managed-node2 32134 1727204442.70869: Calling groups_inventory to load vars for managed-node2 32134 1727204442.70871: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.70882: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.70886: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.70891: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.72205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.73766: done with get_vars() 32134 1727204442.73788: done getting variables 32134 1727204442.73839: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.044) 0:00:17.142 ***** 32134 1727204442.73867: entering _queue_task() for managed-node2/package 32134 1727204442.74105: worker is 1 (out of 1 available) 32134 1727204442.74121: exiting _queue_task() for managed-node2/package 32134 1727204442.74135: done queuing things up, now waiting for results queue to drain 32134 1727204442.74137: waiting for pending results... 
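This task and the next one ("Install python3-libnmstate when using network_state variable") are both gated on network_state, which defaults to an empty dict, so they only run when the caller hands the role declarative nmstate-style state instead of, or in addition to, network_connections. A hedged sketch of a play that would make that condition true (the interface name and addressing are assumed for illustration):

- name: Apply declarative state through the network role (illustrative sketch)
  hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1          # hypothetical interface
              type: ethernet
              state: up
              ipv4:
                enabled: true
                dhcp: true

With network_state non-empty, both install tasks would run, pulling in NetworkManager, nmstate and python3-libnmstate before the state is applied, instead of being skipped as they are here.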
32134 1727204442.74326: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32134 1727204442.74429: in run() - task 12b410aa-8751-753f-5162-000000000021 32134 1727204442.74440: variable 'ansible_search_path' from source: unknown 32134 1727204442.74444: variable 'ansible_search_path' from source: unknown 32134 1727204442.74479: calling self._execute() 32134 1727204442.74559: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.74566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.74578: variable 'omit' from source: magic vars 32134 1727204442.74897: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.74908: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.75015: variable 'network_state' from source: role '' defaults 32134 1727204442.75029: Evaluated conditional (network_state != {}): False 32134 1727204442.75034: when evaluation is False, skipping this task 32134 1727204442.75037: _execute() done 32134 1727204442.75039: dumping result to json 32134 1727204442.75048: done dumping result, returning 32134 1727204442.75054: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-753f-5162-000000000021] 32134 1727204442.75061: sending task result for task 12b410aa-8751-753f-5162-000000000021 32134 1727204442.75159: done sending task result for task 12b410aa-8751-753f-5162-000000000021 32134 1727204442.75162: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204442.75219: no more pending results, returning what we have 32134 1727204442.75222: results queue empty 32134 1727204442.75223: checking for any_errors_fatal 32134 1727204442.75232: done checking for any_errors_fatal 32134 1727204442.75233: checking for max_fail_percentage 32134 1727204442.75235: done checking for max_fail_percentage 32134 1727204442.75236: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.75237: done checking to see if all hosts have failed 32134 1727204442.75238: getting the remaining hosts for this loop 32134 1727204442.75239: done getting the remaining hosts for this loop 32134 1727204442.75244: getting the next task for host managed-node2 32134 1727204442.75250: done getting next task for host managed-node2 32134 1727204442.75254: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32134 1727204442.75257: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204442.75274: getting variables 32134 1727204442.75276: in VariableManager get_vars() 32134 1727204442.75312: Calling all_inventory to load vars for managed-node2 32134 1727204442.75316: Calling groups_inventory to load vars for managed-node2 32134 1727204442.75318: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.75328: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.75331: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.75335: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.76505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.78171: done with get_vars() 32134 1727204442.78195: done getting variables 32134 1727204442.78275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.044) 0:00:17.187 ***** 32134 1727204442.78307: entering _queue_task() for managed-node2/service 32134 1727204442.78308: Creating lock for service 32134 1727204442.78552: worker is 1 (out of 1 available) 32134 1727204442.78567: exiting _queue_task() for managed-node2/service 32134 1727204442.78581: done queuing things up, now waiting for results queue to drain 32134 1727204442.78583: waiting for pending results... 
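The restart task that starts here, like the consent prompt earlier in the run, is guarded by __network_wireless_connections_defined or __network_team_connections_defined: the role only bounces NetworkManager when wireless or team profiles are being configured, so that freshly installed wireless or team support is picked up by the already-running daemon. In this run network_connections contains only ordinary interfaces, so both flags stay false. A hedged sketch of profile entries that would flip them (names and wireless settings are illustrative and assume the role's documented connection schema):

network_connections:
  - name: wlan0                 # would set __network_wireless_connections_defined
    type: wireless
    interface_name: wlan0
    wireless:
      ssid: example-ssid        # hypothetical network
      key_mgmt: wpa-psk
    ip:
      dhcp4: true
  - name: team0                 # would set __network_team_connections_defined
    type: team
    interface_name: team0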
32134 1727204442.78770: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32134 1727204442.78877: in run() - task 12b410aa-8751-753f-5162-000000000022 32134 1727204442.78891: variable 'ansible_search_path' from source: unknown 32134 1727204442.78895: variable 'ansible_search_path' from source: unknown 32134 1727204442.78932: calling self._execute() 32134 1727204442.79010: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.79021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.79037: variable 'omit' from source: magic vars 32134 1727204442.79344: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.79355: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.79461: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.79634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.81371: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.81436: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.81468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.81500: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.81526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.81595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.81623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.81646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.81682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.81696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.81740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.81761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.81785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32134 1727204442.81821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.81834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.81870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.81894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.81915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.81948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.81960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.82107: variable 'network_connections' from source: task vars 32134 1727204442.82123: variable 'interface' from source: set_fact 32134 1727204442.82182: variable 'interface' from source: set_fact 32134 1727204442.82189: variable 'interface' from source: set_fact 32134 1727204442.82245: variable 'interface' from source: set_fact 32134 1727204442.82304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204442.82446: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204442.82478: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204442.82506: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204442.82538: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204442.82572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204442.82592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204442.82613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.82640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204442.82691: variable '__network_team_connections_defined' from source: role '' defaults 
32134 1727204442.82893: variable 'network_connections' from source: task vars 32134 1727204442.82897: variable 'interface' from source: set_fact 32134 1727204442.82950: variable 'interface' from source: set_fact 32134 1727204442.82957: variable 'interface' from source: set_fact 32134 1727204442.83011: variable 'interface' from source: set_fact 32134 1727204442.83039: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204442.83043: when evaluation is False, skipping this task 32134 1727204442.83046: _execute() done 32134 1727204442.83051: dumping result to json 32134 1727204442.83054: done dumping result, returning 32134 1727204442.83063: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-753f-5162-000000000022] 32134 1727204442.83074: sending task result for task 12b410aa-8751-753f-5162-000000000022 32134 1727204442.83165: done sending task result for task 12b410aa-8751-753f-5162-000000000022 32134 1727204442.83168: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204442.83220: no more pending results, returning what we have 32134 1727204442.83224: results queue empty 32134 1727204442.83225: checking for any_errors_fatal 32134 1727204442.83233: done checking for any_errors_fatal 32134 1727204442.83233: checking for max_fail_percentage 32134 1727204442.83235: done checking for max_fail_percentage 32134 1727204442.83236: checking to see if all hosts have failed and the running result is not ok 32134 1727204442.83237: done checking to see if all hosts have failed 32134 1727204442.83238: getting the remaining hosts for this loop 32134 1727204442.83240: done getting the remaining hosts for this loop 32134 1727204442.83244: getting the next task for host managed-node2 32134 1727204442.83252: done getting next task for host managed-node2 32134 1727204442.83256: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32134 1727204442.83259: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204442.83276: getting variables 32134 1727204442.83277: in VariableManager get_vars() 32134 1727204442.83318: Calling all_inventory to load vars for managed-node2 32134 1727204442.83322: Calling groups_inventory to load vars for managed-node2 32134 1727204442.83325: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204442.83334: Calling all_plugins_play to load vars for managed-node2 32134 1727204442.83338: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204442.83341: Calling groups_plugins_play to load vars for managed-node2 32134 1727204442.84564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204442.86139: done with get_vars() 32134 1727204442.86161: done getting variables 32134 1727204442.86211: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:00:42 -0400 (0:00:00.079) 0:00:17.266 ***** 32134 1727204442.86237: entering _queue_task() for managed-node2/service 32134 1727204442.86468: worker is 1 (out of 1 available) 32134 1727204442.86484: exiting _queue_task() for managed-node2/service 32134 1727204442.86499: done queuing things up, now waiting for results queue to drain 32134 1727204442.86501: waiting for pending results... 32134 1727204442.86688: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32134 1727204442.86802: in run() - task 12b410aa-8751-753f-5162-000000000023 32134 1727204442.86818: variable 'ansible_search_path' from source: unknown 32134 1727204442.86821: variable 'ansible_search_path' from source: unknown 32134 1727204442.86856: calling self._execute() 32134 1727204442.86933: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.86942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.86953: variable 'omit' from source: magic vars 32134 1727204442.87257: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.87267: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204442.87408: variable 'network_provider' from source: set_fact 32134 1727204442.87414: variable 'network_state' from source: role '' defaults 32134 1727204442.87424: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32134 1727204442.87430: variable 'omit' from source: magic vars 32134 1727204442.87479: variable 'omit' from source: magic vars 32134 1727204442.87510: variable 'network_service_name' from source: role '' defaults 32134 1727204442.87570: variable 'network_service_name' from source: role '' defaults 32134 1727204442.87662: variable '__network_provider_setup' from source: role '' defaults 32134 1727204442.87668: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204442.87726: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204442.87733: variable '__network_packages_default_nm' from source: role '' defaults 
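Unlike the skipped tasks above, "Enable and start NetworkManager" proceeds: the gate network_provider == "nm" or network_state != {} is true because the provider resolved to nm, and for that provider network_service_name defaults to the NetworkManager unit. Functionally it amounts to the following stand-alone equivalent (a sketch, not the role's literal task; the hard-coded vars mirror what this run resolved):

- name: Enable and start NetworkManager (stand-alone equivalent, illustrative)
  hosts: managed-node2
  become: true
  vars:
    network_provider: nm        # resolved from set_fact in this run
    network_state: {}           # role default
  tasks:
    - name: Ensure the NetworkManager service is enabled and running
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

Because this task actually executes, the service action plugin hands off to the systemd module, which is why the log that follows shows a remote temp directory being created and AnsiballZ_systemd.py being built and copied to the host before it runs.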
32134 1727204442.87784: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204442.87981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204442.89911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204442.89962: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204442.90002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204442.90036: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204442.90059: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204442.90130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.90156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.90177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.90210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.90224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.90267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.90286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.90308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.90348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.90457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.90545: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32134 1727204442.90644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.90666: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.90690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.90723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.90735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.90814: variable 'ansible_python' from source: facts 32134 1727204442.90832: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32134 1727204442.90900: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204442.90963: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204442.91070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.91092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.91119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.91150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.91163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.91204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204442.91231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204442.91254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.91285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204442.91299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204442.91417: variable 'network_connections' from 
source: task vars 32134 1727204442.91420: variable 'interface' from source: set_fact 32134 1727204442.91485: variable 'interface' from source: set_fact 32134 1727204442.91497: variable 'interface' from source: set_fact 32134 1727204442.91558: variable 'interface' from source: set_fact 32134 1727204442.91644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204442.91800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204442.91842: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204442.91878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204442.91926: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204442.91976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204442.92014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204442.92039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204442.92066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204442.92113: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.92344: variable 'network_connections' from source: task vars 32134 1727204442.92351: variable 'interface' from source: set_fact 32134 1727204442.92413: variable 'interface' from source: set_fact 32134 1727204442.92425: variable 'interface' from source: set_fact 32134 1727204442.92488: variable 'interface' from source: set_fact 32134 1727204442.92531: variable '__network_packages_default_wireless' from source: role '' defaults 32134 1727204442.92602: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204442.92845: variable 'network_connections' from source: task vars 32134 1727204442.92848: variable 'interface' from source: set_fact 32134 1727204442.92912: variable 'interface' from source: set_fact 32134 1727204442.92921: variable 'interface' from source: set_fact 32134 1727204442.92979: variable 'interface' from source: set_fact 32134 1727204442.93004: variable '__network_packages_default_team' from source: role '' defaults 32134 1727204442.93069: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204442.93319: variable 'network_connections' from source: task vars 32134 1727204442.93324: variable 'interface' from source: set_fact 32134 1727204442.93381: variable 'interface' from source: set_fact 32134 1727204442.93387: variable 'interface' from source: set_fact 32134 1727204442.93451: variable 'interface' from source: set_fact 32134 1727204442.93503: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204442.93558: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 
1727204442.93565: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204442.93619: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204442.93800: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32134 1727204442.94206: variable 'network_connections' from source: task vars 32134 1727204442.94211: variable 'interface' from source: set_fact 32134 1727204442.94263: variable 'interface' from source: set_fact 32134 1727204442.94269: variable 'interface' from source: set_fact 32134 1727204442.94326: variable 'interface' from source: set_fact 32134 1727204442.94335: variable 'ansible_distribution' from source: facts 32134 1727204442.94338: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.94346: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.94366: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32134 1727204442.94513: variable 'ansible_distribution' from source: facts 32134 1727204442.94517: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.94529: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.94532: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32134 1727204442.94674: variable 'ansible_distribution' from source: facts 32134 1727204442.94677: variable '__network_rh_distros' from source: role '' defaults 32134 1727204442.94684: variable 'ansible_distribution_major_version' from source: facts 32134 1727204442.94716: variable 'network_provider' from source: set_fact 32134 1727204442.94739: variable 'omit' from source: magic vars 32134 1727204442.94767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204442.94792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204442.94809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204442.94828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204442.94839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204442.94869: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204442.94872: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.94879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.94964: Set connection var ansible_timeout to 10 32134 1727204442.94978: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204442.94981: Set connection var ansible_connection to ssh 32134 1727204442.94984: Set connection var ansible_shell_type to sh 32134 1727204442.94992: Set connection var ansible_shell_executable to /bin/sh 32134 1727204442.94999: Set connection var ansible_pipelining to False 32134 1727204442.95022: variable 'ansible_shell_executable' from source: unknown 32134 1727204442.95025: variable 'ansible_connection' from source: unknown 32134 1727204442.95028: variable 'ansible_module_compression' from source: unknown 32134 1727204442.95031: variable 'ansible_shell_type' from source: unknown 32134 1727204442.95035: variable 
'ansible_shell_executable' from source: unknown 32134 1727204442.95039: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204442.95046: variable 'ansible_pipelining' from source: unknown 32134 1727204442.95049: variable 'ansible_timeout' from source: unknown 32134 1727204442.95053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204442.95143: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204442.95154: variable 'omit' from source: magic vars 32134 1727204442.95160: starting attempt loop 32134 1727204442.95163: running the handler 32134 1727204442.95234: variable 'ansible_facts' from source: unknown 32134 1727204442.95870: _low_level_execute_command(): starting 32134 1727204442.95876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204442.96401: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204442.96429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204442.96433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204442.96482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204442.96485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204442.96488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204442.96547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204442.98342: stdout chunk (state=3): >>>/root <<< 32134 1727204442.98444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204442.98506: stderr chunk (state=3): >>><<< 32134 1727204442.98510: stdout chunk (state=3): >>><<< 32134 1727204442.98531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204442.98543: _low_level_execute_command(): starting 32134 1727204442.98549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366 `" && echo ansible-tmp-1727204442.9853106-33330-88531086061366="` echo /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366 `" ) && sleep 0' 32134 1727204442.99046: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204442.99050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204442.99052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204442.99055: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204442.99057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204442.99113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204442.99116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204442.99167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204443.01200: stdout chunk (state=3): >>>ansible-tmp-1727204442.9853106-33330-88531086061366=/root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366 <<< 32134 1727204443.01318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204443.01373: stderr chunk (state=3): >>><<< 32134 1727204443.01377: stdout chunk (state=3): >>><<< 32134 1727204443.01394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204442.9853106-33330-88531086061366=/root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204443.01432: variable 'ansible_module_compression' from source: unknown 32134 1727204443.01482: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 32134 1727204443.01487: ANSIBALLZ: Acquiring lock 32134 1727204443.01491: ANSIBALLZ: Lock acquired: 140589353832608 32134 1727204443.01494: ANSIBALLZ: Creating module 32134 1727204443.35992: ANSIBALLZ: Writing module into payload 32134 1727204443.36296: ANSIBALLZ: Writing module 32134 1727204443.36301: ANSIBALLZ: Renaming module 32134 1727204443.36311: ANSIBALLZ: Done creating module 32134 1727204443.36316: variable 'ansible_facts' from source: unknown 32134 1727204443.36533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py 32134 1727204443.36715: Sending initial data 32134 1727204443.36726: Sent initial data (155 bytes) 32134 1727204443.37407: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204443.37456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204443.37475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204443.37506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204443.37575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204443.39330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204443.39392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204443.39456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpfm0dxj1l /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py <<< 32134 1727204443.39471: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py" <<< 32134 1727204443.39499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpfm0dxj1l" to remote "/root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py" <<< 32134 1727204443.42199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204443.42244: stderr chunk (state=3): >>><<< 32134 1727204443.42255: stdout chunk (state=3): >>><<< 32134 1727204443.42283: done transferring module to remote 32134 1727204443.42321: _low_level_execute_command(): starting 32134 1727204443.42324: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/ /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py && sleep 0' 32134 1727204443.43100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204443.43144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204443.43175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204443.45223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204443.45226: stdout chunk (state=3): >>><<< 32134 1727204443.45229: stderr chunk (state=3): >>><<< 32134 1727204443.45336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204443.45341: _low_level_execute_command(): starting 32134 1727204443.45344: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/AnsiballZ_systemd.py && sleep 0' 32134 1727204443.46009: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204443.46036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204443.46059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204443.46140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204443.79051: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4460544", "MemoryAvailable": "infinity", "CPUUsageNSec": "1571237000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 32134 1727204443.79099: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": 
"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32134 1727204443.81144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204443.81164: stderr chunk (state=3): >>><<< 32134 1727204443.81174: stdout chunk (state=3): >>><<< 32134 1727204443.81203: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4460544", "MemoryAvailable": "infinity", "CPUUsageNSec": "1571237000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204443.81531: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204443.81572: _low_level_execute_command(): starting 32134 1727204443.81583: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204442.9853106-33330-88531086061366/ > /dev/null 2>&1 && sleep 0' 32134 1727204443.82275: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204443.82293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204443.82308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204443.82345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204443.82365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204443.82451: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204443.82494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204443.82522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204443.82539: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 32134 1727204443.82619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204443.84707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204443.84754: stderr chunk (state=3): >>><<< 32134 1727204443.84764: stdout chunk (state=3): >>><<< 32134 1727204443.84895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204443.84899: handler run complete 32134 1727204443.84901: attempt loop complete, returning result 32134 1727204443.84904: _execute() done 32134 1727204443.84906: dumping result to json 32134 1727204443.84933: done dumping result, returning 32134 1727204443.84949: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-753f-5162-000000000023] 32134 1727204443.84959: sending task result for task 12b410aa-8751-753f-5162-000000000023 32134 1727204443.85595: done sending task result for task 12b410aa-8751-753f-5162-000000000023 32134 1727204443.85599: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204443.85659: no more pending results, returning what we have 32134 1727204443.85663: results queue empty 32134 1727204443.85664: checking for any_errors_fatal 32134 1727204443.85675: done checking for any_errors_fatal 32134 1727204443.85676: checking for max_fail_percentage 32134 1727204443.85678: done checking for max_fail_percentage 32134 1727204443.85679: checking to see if all hosts have failed and the running result is not ok 32134 1727204443.85680: done checking to see if all hosts have failed 32134 1727204443.85681: getting the remaining hosts for this loop 32134 1727204443.85683: done getting the remaining hosts for this loop 32134 1727204443.85688: getting the next task for host managed-node2 32134 1727204443.85697: done getting next task for host managed-node2 32134 1727204443.85703: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32134 1727204443.85706: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204443.85723: getting variables 32134 1727204443.85725: in VariableManager get_vars() 32134 1727204443.85765: Calling all_inventory to load vars for managed-node2 32134 1727204443.85769: Calling groups_inventory to load vars for managed-node2 32134 1727204443.85772: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204443.85784: Calling all_plugins_play to load vars for managed-node2 32134 1727204443.85787: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204443.85997: Calling groups_plugins_play to load vars for managed-node2 32134 1727204443.88384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204443.91451: done with get_vars() 32134 1727204443.91494: done getting variables 32134 1727204443.91571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:00:43 -0400 (0:00:01.053) 0:00:18.320 ***** 32134 1727204443.91614: entering _queue_task() for managed-node2/service 32134 1727204443.91979: worker is 1 (out of 1 available) 32134 1727204443.91994: exiting _queue_task() for managed-node2/service 32134 1727204443.92008: done queuing things up, now waiting for results queue to drain 32134 1727204443.92010: waiting for pending results... 
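Note: the "Enable and start NetworkManager" result recorded above came from the service action plugin dispatching ansible.legacy.systemd with the logged module_args (name=NetworkManager, state=started, enabled=true). Reconstructed from those arguments only, the role task is roughly equivalent to the sketch below; the exact YAML in the role may differ, and the task name and fully qualified module name here are illustrative, not taken from the role source.

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true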
32134 1727204443.92326: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32134 1727204443.92502: in run() - task 12b410aa-8751-753f-5162-000000000024 32134 1727204443.92529: variable 'ansible_search_path' from source: unknown 32134 1727204443.92540: variable 'ansible_search_path' from source: unknown 32134 1727204443.92593: calling self._execute() 32134 1727204443.92702: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204443.92721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204443.92739: variable 'omit' from source: magic vars 32134 1727204443.93182: variable 'ansible_distribution_major_version' from source: facts 32134 1727204443.93395: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204443.93400: variable 'network_provider' from source: set_fact 32134 1727204443.93403: Evaluated conditional (network_provider == "nm"): True 32134 1727204443.93475: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204443.93598: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204443.93865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204443.96515: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204443.96605: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204443.96657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204443.96709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204443.96774: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204443.96864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204443.96915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204443.96990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204443.97016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204443.97039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204443.97106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204443.97145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32134 1727204443.97181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204443.97245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204443.97318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204443.97335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204443.97368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204443.97406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204443.97466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204443.97488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204443.97686: variable 'network_connections' from source: task vars 32134 1727204443.97751: variable 'interface' from source: set_fact 32134 1727204443.97815: variable 'interface' from source: set_fact 32134 1727204443.97831: variable 'interface' from source: set_fact 32134 1727204443.97918: variable 'interface' from source: set_fact 32134 1727204443.98023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204443.98243: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204443.98407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204443.98411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204443.98416: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204443.98446: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204443.98478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204443.98525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204443.98563: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204443.98630: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204443.99153: variable 'network_connections' from source: task vars 32134 1727204443.99166: variable 'interface' from source: set_fact 32134 1727204443.99247: variable 'interface' from source: set_fact 32134 1727204443.99260: variable 'interface' from source: set_fact 32134 1727204443.99339: variable 'interface' from source: set_fact 32134 1727204443.99393: Evaluated conditional (__network_wpa_supplicant_required): False 32134 1727204443.99497: when evaluation is False, skipping this task 32134 1727204443.99500: _execute() done 32134 1727204443.99514: dumping result to json 32134 1727204443.99517: done dumping result, returning 32134 1727204443.99520: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-753f-5162-000000000024] 32134 1727204443.99522: sending task result for task 12b410aa-8751-753f-5162-000000000024 32134 1727204443.99597: done sending task result for task 12b410aa-8751-753f-5162-000000000024 32134 1727204443.99694: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32134 1727204443.99755: no more pending results, returning what we have 32134 1727204443.99759: results queue empty 32134 1727204443.99761: checking for any_errors_fatal 32134 1727204443.99791: done checking for any_errors_fatal 32134 1727204443.99793: checking for max_fail_percentage 32134 1727204443.99795: done checking for max_fail_percentage 32134 1727204443.99796: checking to see if all hosts have failed and the running result is not ok 32134 1727204443.99797: done checking to see if all hosts have failed 32134 1727204443.99798: getting the remaining hosts for this loop 32134 1727204443.99800: done getting the remaining hosts for this loop 32134 1727204443.99805: getting the next task for host managed-node2 32134 1727204443.99816: done getting next task for host managed-node2 32134 1727204443.99821: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32134 1727204443.99825: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204443.99842: getting variables 32134 1727204443.99844: in VariableManager get_vars() 32134 1727204444.00098: Calling all_inventory to load vars for managed-node2 32134 1727204444.00103: Calling groups_inventory to load vars for managed-node2 32134 1727204444.00107: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204444.00121: Calling all_plugins_play to load vars for managed-node2 32134 1727204444.00125: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204444.00129: Calling groups_plugins_play to load vars for managed-node2 32134 1727204444.02417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204444.05436: done with get_vars() 32134 1727204444.05471: done getting variables 32134 1727204444.05546: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:00:44 -0400 (0:00:00.139) 0:00:18.459 ***** 32134 1727204444.05583: entering _queue_task() for managed-node2/service 32134 1727204444.05931: worker is 1 (out of 1 available) 32134 1727204444.05945: exiting _queue_task() for managed-node2/service 32134 1727204444.05957: done queuing things up, now waiting for results queue to drain 32134 1727204444.05959: waiting for pending results... 32134 1727204444.06276: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 32134 1727204444.06451: in run() - task 12b410aa-8751-753f-5162-000000000025 32134 1727204444.06472: variable 'ansible_search_path' from source: unknown 32134 1727204444.06595: variable 'ansible_search_path' from source: unknown 32134 1727204444.06599: calling self._execute() 32134 1727204444.06637: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204444.06651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204444.06668: variable 'omit' from source: magic vars 32134 1727204444.07108: variable 'ansible_distribution_major_version' from source: facts 32134 1727204444.07130: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204444.07280: variable 'network_provider' from source: set_fact 32134 1727204444.07295: Evaluated conditional (network_provider == "initscripts"): False 32134 1727204444.07305: when evaluation is False, skipping this task 32134 1727204444.07316: _execute() done 32134 1727204444.07325: dumping result to json 32134 1727204444.07334: done dumping result, returning 32134 1727204444.07346: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-753f-5162-000000000025] 32134 1727204444.07357: sending task result for task 12b410aa-8751-753f-5162-000000000025 32134 1727204444.07558: done sending task result for task 12b410aa-8751-753f-5162-000000000025 32134 1727204444.07562: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 
1727204444.07637: no more pending results, returning what we have 32134 1727204444.07642: results queue empty 32134 1727204444.07643: checking for any_errors_fatal 32134 1727204444.07657: done checking for any_errors_fatal 32134 1727204444.07658: checking for max_fail_percentage 32134 1727204444.07661: done checking for max_fail_percentage 32134 1727204444.07662: checking to see if all hosts have failed and the running result is not ok 32134 1727204444.07663: done checking to see if all hosts have failed 32134 1727204444.07664: getting the remaining hosts for this loop 32134 1727204444.07666: done getting the remaining hosts for this loop 32134 1727204444.07672: getting the next task for host managed-node2 32134 1727204444.07681: done getting next task for host managed-node2 32134 1727204444.07686: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32134 1727204444.07693: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204444.07718: getting variables 32134 1727204444.07721: in VariableManager get_vars() 32134 1727204444.07766: Calling all_inventory to load vars for managed-node2 32134 1727204444.07770: Calling groups_inventory to load vars for managed-node2 32134 1727204444.07774: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204444.08018: Calling all_plugins_play to load vars for managed-node2 32134 1727204444.08024: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204444.08029: Calling groups_plugins_play to load vars for managed-node2 32134 1727204444.10192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204444.13139: done with get_vars() 32134 1727204444.13186: done getting variables 32134 1727204444.13266: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:00:44 -0400 (0:00:00.077) 0:00:18.537 ***** 32134 1727204444.13311: entering _queue_task() for managed-node2/copy 32134 1727204444.13692: worker is 1 (out of 1 available) 32134 1727204444.13707: exiting _queue_task() for managed-node2/copy 32134 1727204444.13724: done queuing things up, now waiting for results queue to drain 32134 1727204444.13726: waiting for pending results... 
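Note: the skip of "Enable network service" above and the "Ensure initscripts network file dependency is present" task just queued follow the same pattern visible in the log: network_provider was set to "nm" earlier in the run, so any task guarded by network_provider == "initscripts" evaluates its conditional to False and is skipped. A minimal sketch of that guard, assuming a plain service task; the service name and module choice are illustrative and not necessarily the role's exact YAML:

    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "initscripts"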
32134 1727204444.14114: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32134 1727204444.14216: in run() - task 12b410aa-8751-753f-5162-000000000026 32134 1727204444.14240: variable 'ansible_search_path' from source: unknown 32134 1727204444.14249: variable 'ansible_search_path' from source: unknown 32134 1727204444.14297: calling self._execute() 32134 1727204444.14406: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204444.14430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204444.14448: variable 'omit' from source: magic vars 32134 1727204444.14900: variable 'ansible_distribution_major_version' from source: facts 32134 1727204444.14921: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204444.15082: variable 'network_provider' from source: set_fact 32134 1727204444.15096: Evaluated conditional (network_provider == "initscripts"): False 32134 1727204444.15105: when evaluation is False, skipping this task 32134 1727204444.15115: _execute() done 32134 1727204444.15186: dumping result to json 32134 1727204444.15189: done dumping result, returning 32134 1727204444.15194: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-753f-5162-000000000026] 32134 1727204444.15197: sending task result for task 12b410aa-8751-753f-5162-000000000026 32134 1727204444.15279: done sending task result for task 12b410aa-8751-753f-5162-000000000026 32134 1727204444.15284: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32134 1727204444.15345: no more pending results, returning what we have 32134 1727204444.15350: results queue empty 32134 1727204444.15351: checking for any_errors_fatal 32134 1727204444.15360: done checking for any_errors_fatal 32134 1727204444.15361: checking for max_fail_percentage 32134 1727204444.15363: done checking for max_fail_percentage 32134 1727204444.15364: checking to see if all hosts have failed and the running result is not ok 32134 1727204444.15365: done checking to see if all hosts have failed 32134 1727204444.15366: getting the remaining hosts for this loop 32134 1727204444.15368: done getting the remaining hosts for this loop 32134 1727204444.15372: getting the next task for host managed-node2 32134 1727204444.15381: done getting next task for host managed-node2 32134 1727204444.15386: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32134 1727204444.15391: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204444.15410: getting variables 32134 1727204444.15414: in VariableManager get_vars() 32134 1727204444.15458: Calling all_inventory to load vars for managed-node2 32134 1727204444.15461: Calling groups_inventory to load vars for managed-node2 32134 1727204444.15464: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204444.15478: Calling all_plugins_play to load vars for managed-node2 32134 1727204444.15482: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204444.15486: Calling groups_plugins_play to load vars for managed-node2 32134 1727204444.18094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204444.21126: done with get_vars() 32134 1727204444.21170: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:00:44 -0400 (0:00:00.079) 0:00:18.616 ***** 32134 1727204444.21291: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 32134 1727204444.21294: Creating lock for fedora.linux_system_roles.network_connections 32134 1727204444.21714: worker is 1 (out of 1 available) 32134 1727204444.21729: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 32134 1727204444.21743: done queuing things up, now waiting for results queue to drain 32134 1727204444.21745: waiting for pending results... 32134 1727204444.22213: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32134 1727204444.22262: in run() - task 12b410aa-8751-753f-5162-000000000027 32134 1727204444.22282: variable 'ansible_search_path' from source: unknown 32134 1727204444.22292: variable 'ansible_search_path' from source: unknown 32134 1727204444.22397: calling self._execute() 32134 1727204444.22472: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204444.22487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204444.22507: variable 'omit' from source: magic vars 32134 1727204444.22977: variable 'ansible_distribution_major_version' from source: facts 32134 1727204444.23070: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204444.23074: variable 'omit' from source: magic vars 32134 1727204444.23094: variable 'omit' from source: magic vars 32134 1727204444.23313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204444.25974: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204444.26065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204444.26120: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204444.26164: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204444.26208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204444.26308: variable 'network_provider' from source: set_fact 32134 1727204444.26482: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204444.26547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204444.26595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204444.26645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204444.26762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204444.26772: variable 'omit' from source: magic vars 32134 1727204444.26926: variable 'omit' from source: magic vars 32134 1727204444.27066: variable 'network_connections' from source: task vars 32134 1727204444.27084: variable 'interface' from source: set_fact 32134 1727204444.27177: variable 'interface' from source: set_fact 32134 1727204444.27192: variable 'interface' from source: set_fact 32134 1727204444.27276: variable 'interface' from source: set_fact 32134 1727204444.27479: variable 'omit' from source: magic vars 32134 1727204444.27496: variable '__lsr_ansible_managed' from source: task vars 32134 1727204444.27580: variable '__lsr_ansible_managed' from source: task vars 32134 1727204444.28187: Loaded config def from plugin (lookup/template) 32134 1727204444.28191: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32134 1727204444.28196: File lookup term: get_ansible_managed.j2 32134 1727204444.28199: variable 'ansible_search_path' from source: unknown 32134 1727204444.28202: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32134 1727204444.28206: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32134 1727204444.28222: variable 'ansible_search_path' from source: unknown 32134 1727204444.39962: variable 'ansible_managed' from source: unknown 32134 
1727204444.40225: variable 'omit' from source: magic vars 32134 1727204444.40298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204444.40312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204444.40342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204444.40369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204444.40395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204444.40431: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204444.40495: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204444.40498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204444.40594: Set connection var ansible_timeout to 10 32134 1727204444.40626: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204444.40637: Set connection var ansible_connection to ssh 32134 1727204444.40652: Set connection var ansible_shell_type to sh 32134 1727204444.40667: Set connection var ansible_shell_executable to /bin/sh 32134 1727204444.40680: Set connection var ansible_pipelining to False 32134 1727204444.40712: variable 'ansible_shell_executable' from source: unknown 32134 1727204444.40720: variable 'ansible_connection' from source: unknown 32134 1727204444.40733: variable 'ansible_module_compression' from source: unknown 32134 1727204444.40757: variable 'ansible_shell_type' from source: unknown 32134 1727204444.40760: variable 'ansible_shell_executable' from source: unknown 32134 1727204444.40762: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204444.40764: variable 'ansible_pipelining' from source: unknown 32134 1727204444.40843: variable 'ansible_timeout' from source: unknown 32134 1727204444.40846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204444.40953: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204444.40983: variable 'omit' from source: magic vars 32134 1727204444.40998: starting attempt loop 32134 1727204444.41005: running the handler 32134 1727204444.41023: _low_level_execute_command(): starting 32134 1727204444.41035: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204444.41827: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204444.41843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204444.41875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204444.41934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204444.42003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204444.42018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204444.42110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204444.43882: stdout chunk (state=3): >>>/root <<< 32134 1727204444.44202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204444.44206: stdout chunk (state=3): >>><<< 32134 1727204444.44208: stderr chunk (state=3): >>><<< 32134 1727204444.44211: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204444.44218: _low_level_execute_command(): starting 32134 1727204444.44226: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878 `" && echo ansible-tmp-1727204444.4420211-33359-25665041642878="` echo /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878 `" ) && sleep 0' 32134 1727204444.45003: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204444.45027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204444.45102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204444.47095: stdout chunk (state=3): >>>ansible-tmp-1727204444.4420211-33359-25665041642878=/root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878 <<< 32134 1727204444.47304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204444.47308: stdout chunk (state=3): >>><<< 32134 1727204444.47310: stderr chunk (state=3): >>><<< 32134 1727204444.47496: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204444.4420211-33359-25665041642878=/root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204444.47500: variable 'ansible_module_compression' from source: unknown 32134 1727204444.47502: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 32134 1727204444.47504: ANSIBALLZ: Acquiring lock 32134 1727204444.47507: ANSIBALLZ: Lock acquired: 140589350033536 32134 1727204444.47509: ANSIBALLZ: Creating module 32134 1727204444.74836: ANSIBALLZ: Writing module into payload 32134 1727204444.75299: ANSIBALLZ: Writing module 32134 1727204444.75339: ANSIBALLZ: Renaming module 32134 1727204444.75352: ANSIBALLZ: Done creating module 32134 1727204444.75385: variable 'ansible_facts' from source: unknown 32134 1727204444.75495: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py 32134 1727204444.75724: Sending initial data 32134 1727204444.75727: Sent initial data (167 bytes) 32134 1727204444.76328: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204444.76407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204444.76470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204444.76487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204444.76517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204444.76598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204444.78336: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204444.78408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204444.78466: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpys73i32b /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py <<< 32134 1727204444.78469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py" <<< 32134 1727204444.78514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpys73i32b" to remote "/root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py" <<< 32134 1727204444.80271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204444.80315: stderr chunk (state=3): >>><<< 32134 1727204444.80326: stdout chunk (state=3): >>><<< 32134 1727204444.80362: done transferring module to remote 32134 1727204444.80384: _low_level_execute_command(): starting 32134 1727204444.80406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/ /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py && sleep 0' 32134 1727204444.81095: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204444.81114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204444.81132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 
1727204444.81162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204444.81207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204444.81286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204444.81296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204444.81301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204444.81379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204444.83393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204444.83397: stdout chunk (state=3): >>><<< 32134 1727204444.83399: stderr chunk (state=3): >>><<< 32134 1727204444.83427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204444.83436: _low_level_execute_command(): starting 32134 1727204444.83445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/AnsiballZ_network_connections.py && sleep 0' 32134 1727204444.84100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204444.84120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204444.84137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204444.84166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204444.84205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204444.84227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204444.84321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204444.84346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204444.84408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204444.84452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.14517: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 2e4ab50e-0a87-42ab-af52-2be8774b7af4\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32134 1727204445.16596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204445.16660: stderr chunk (state=3): >>><<< 32134 1727204445.16664: stdout chunk (state=3): >>><<< 32134 1727204445.16680: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 2e4ab50e-0a87-42ab-af52-2be8774b7af4\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
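The module result above is produced by the role invoking its network_connections module over the established SSH connection with provider "nm". A short sketch of play-level variables that would reproduce the module_args seen in this invocation (the host name, variable names, and connection values are taken from the log; the play structure itself is illustrative):

# Sketch of a play that yields the module_args shown above.
- hosts: managed-node2
  vars:
    network_provider: nm
    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        ip:
          ipv6_disabled: true
  roles:
    - fedora.linux_system_roles.network

The "__header" value ("# Ansible managed ... system_role:network") is supplied by the role itself via the get_ansible_managed.j2 template lookup that appears earlier in the log, not by the caller.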
32134 1727204445.16722: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'type': 'ethernet', 'ip': {'ipv6_disabled': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204445.16735: _low_level_execute_command(): starting 32134 1727204445.16742: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204444.4420211-33359-25665041642878/ > /dev/null 2>&1 && sleep 0' 32134 1727204445.17239: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204445.17242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204445.17245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204445.17247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204445.17249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204445.17299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204445.17318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204445.17355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.19361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204445.19414: stderr chunk (state=3): >>><<< 32134 1727204445.19418: stdout chunk (state=3): >>><<< 32134 1727204445.19436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204445.19444: handler run complete 32134 1727204445.19474: attempt loop complete, returning result 32134 1727204445.19477: _execute() done 32134 1727204445.19479: dumping result to json 32134 1727204445.19486: done dumping result, returning 32134 1727204445.19497: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-753f-5162-000000000027] 32134 1727204445.19503: sending task result for task 12b410aa-8751-753f-5162-000000000027 32134 1727204445.19615: done sending task result for task 12b410aa-8751-753f-5162-000000000027 32134 1727204445.19618: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 2e4ab50e-0a87-42ab-af52-2be8774b7af4 32134 1727204445.19741: no more pending results, returning what we have 32134 1727204445.19745: results queue empty 32134 1727204445.19746: checking for any_errors_fatal 32134 1727204445.19761: done checking for any_errors_fatal 32134 1727204445.19762: checking for max_fail_percentage 32134 1727204445.19764: done checking for max_fail_percentage 32134 1727204445.19765: checking to see if all hosts have failed and the running result is not ok 32134 1727204445.19766: done checking to see if all hosts have failed 32134 1727204445.19767: getting the remaining hosts for this loop 32134 1727204445.19769: done getting the remaining hosts for this loop 32134 1727204445.19773: getting the next task for host managed-node2 32134 1727204445.19780: done getting next task for host managed-node2 32134 1727204445.19784: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32134 1727204445.19788: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204445.19808: getting variables 32134 1727204445.19810: in VariableManager get_vars() 32134 1727204445.19851: Calling all_inventory to load vars for managed-node2 32134 1727204445.19854: Calling groups_inventory to load vars for managed-node2 32134 1727204445.19857: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204445.19867: Calling all_plugins_play to load vars for managed-node2 32134 1727204445.19870: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204445.19875: Calling groups_plugins_play to load vars for managed-node2 32134 1727204445.21241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204445.22817: done with get_vars() 32134 1727204445.22846: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:00:45 -0400 (0:00:01.016) 0:00:19.633 ***** 32134 1727204445.22924: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 32134 1727204445.22926: Creating lock for fedora.linux_system_roles.network_state 32134 1727204445.23209: worker is 1 (out of 1 available) 32134 1727204445.23225: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 32134 1727204445.23240: done queuing things up, now waiting for results queue to drain 32134 1727204445.23242: waiting for pending results... 32134 1727204445.23453: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 32134 1727204445.23559: in run() - task 12b410aa-8751-753f-5162-000000000028 32134 1727204445.23574: variable 'ansible_search_path' from source: unknown 32134 1727204445.23578: variable 'ansible_search_path' from source: unknown 32134 1727204445.23614: calling self._execute() 32134 1727204445.23700: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.23704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.23716: variable 'omit' from source: magic vars 32134 1727204445.24034: variable 'ansible_distribution_major_version' from source: facts 32134 1727204445.24045: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204445.24153: variable 'network_state' from source: role '' defaults 32134 1727204445.24163: Evaluated conditional (network_state != {}): False 32134 1727204445.24166: when evaluation is False, skipping this task 32134 1727204445.24171: _execute() done 32134 1727204445.24175: dumping result to json 32134 1727204445.24185: done dumping result, returning 32134 1727204445.24188: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-753f-5162-000000000028] 32134 1727204445.24196: sending task result for task 12b410aa-8751-753f-5162-000000000028 32134 1727204445.24293: done sending task result for task 12b410aa-8751-753f-5162-000000000028 32134 1727204445.24297: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204445.24354: no more pending results, returning what we have 32134 1727204445.24358: results queue empty 32134 1727204445.24359: checking for any_errors_fatal 32134 1727204445.24372: done checking for 
any_errors_fatal 32134 1727204445.24373: checking for max_fail_percentage 32134 1727204445.24374: done checking for max_fail_percentage 32134 1727204445.24375: checking to see if all hosts have failed and the running result is not ok 32134 1727204445.24376: done checking to see if all hosts have failed 32134 1727204445.24378: getting the remaining hosts for this loop 32134 1727204445.24379: done getting the remaining hosts for this loop 32134 1727204445.24383: getting the next task for host managed-node2 32134 1727204445.24391: done getting next task for host managed-node2 32134 1727204445.24396: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32134 1727204445.24399: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204445.24417: getting variables 32134 1727204445.24419: in VariableManager get_vars() 32134 1727204445.24454: Calling all_inventory to load vars for managed-node2 32134 1727204445.24457: Calling groups_inventory to load vars for managed-node2 32134 1727204445.24460: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204445.24469: Calling all_plugins_play to load vars for managed-node2 32134 1727204445.24472: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204445.24475: Calling groups_plugins_play to load vars for managed-node2 32134 1727204445.25682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204445.27262: done with get_vars() 32134 1727204445.27283: done getting variables 32134 1727204445.27338: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:00:45 -0400 (0:00:00.044) 0:00:19.677 ***** 32134 1727204445.27364: entering _queue_task() for managed-node2/debug 32134 1727204445.27600: worker is 1 (out of 1 available) 32134 1727204445.27614: exiting _queue_task() for managed-node2/debug 32134 1727204445.27626: done queuing things up, now waiting for results queue to drain 32134 1727204445.27628: waiting for pending results... 
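The "Configure networking state" task is skipped here because the role default leaves network_state as an empty dict, so the condition network_state != {} evaluates to False. A sketch of how a caller could opt in to state-based configuration instead is below; the variable name and the skip condition come from the log, while the dictionary contents are illustrative and assume the nmstate-style schema commonly used with this variable.

# Illustrative values only; any non-empty network_state makes the task run.
- hosts: managed-node2
  vars:
    network_state:
      interfaces:
        - name: ethtest0
          type: ethernet
          state: up
  roles:
    - fedora.linux_system_roles.network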
32134 1727204445.27818: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32134 1727204445.27923: in run() - task 12b410aa-8751-753f-5162-000000000029 32134 1727204445.27935: variable 'ansible_search_path' from source: unknown 32134 1727204445.27938: variable 'ansible_search_path' from source: unknown 32134 1727204445.27975: calling self._execute() 32134 1727204445.28050: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.28057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.28068: variable 'omit' from source: magic vars 32134 1727204445.28381: variable 'ansible_distribution_major_version' from source: facts 32134 1727204445.28392: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204445.28401: variable 'omit' from source: magic vars 32134 1727204445.28453: variable 'omit' from source: magic vars 32134 1727204445.28483: variable 'omit' from source: magic vars 32134 1727204445.28523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204445.28555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204445.28573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204445.28591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204445.28602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204445.28635: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204445.28639: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.28641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.28725: Set connection var ansible_timeout to 10 32134 1727204445.28739: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204445.28743: Set connection var ansible_connection to ssh 32134 1727204445.28746: Set connection var ansible_shell_type to sh 32134 1727204445.28760: Set connection var ansible_shell_executable to /bin/sh 32134 1727204445.28764: Set connection var ansible_pipelining to False 32134 1727204445.28781: variable 'ansible_shell_executable' from source: unknown 32134 1727204445.28784: variable 'ansible_connection' from source: unknown 32134 1727204445.28787: variable 'ansible_module_compression' from source: unknown 32134 1727204445.28792: variable 'ansible_shell_type' from source: unknown 32134 1727204445.28796: variable 'ansible_shell_executable' from source: unknown 32134 1727204445.28800: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.28805: variable 'ansible_pipelining' from source: unknown 32134 1727204445.28809: variable 'ansible_timeout' from source: unknown 32134 1727204445.28817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.28938: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 
1727204445.28951: variable 'omit' from source: magic vars 32134 1727204445.28954: starting attempt loop 32134 1727204445.28957: running the handler 32134 1727204445.29064: variable '__network_connections_result' from source: set_fact 32134 1727204445.29112: handler run complete 32134 1727204445.29131: attempt loop complete, returning result 32134 1727204445.29134: _execute() done 32134 1727204445.29137: dumping result to json 32134 1727204445.29142: done dumping result, returning 32134 1727204445.29152: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-753f-5162-000000000029] 32134 1727204445.29157: sending task result for task 12b410aa-8751-753f-5162-000000000029 32134 1727204445.29246: done sending task result for task 12b410aa-8751-753f-5162-000000000029 32134 1727204445.29249: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 2e4ab50e-0a87-42ab-af52-2be8774b7af4" ] } 32134 1727204445.29339: no more pending results, returning what we have 32134 1727204445.29342: results queue empty 32134 1727204445.29343: checking for any_errors_fatal 32134 1727204445.29348: done checking for any_errors_fatal 32134 1727204445.29349: checking for max_fail_percentage 32134 1727204445.29351: done checking for max_fail_percentage 32134 1727204445.29352: checking to see if all hosts have failed and the running result is not ok 32134 1727204445.29353: done checking to see if all hosts have failed 32134 1727204445.29353: getting the remaining hosts for this loop 32134 1727204445.29355: done getting the remaining hosts for this loop 32134 1727204445.29366: getting the next task for host managed-node2 32134 1727204445.29371: done getting next task for host managed-node2 32134 1727204445.29375: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32134 1727204445.29378: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204445.29391: getting variables 32134 1727204445.29392: in VariableManager get_vars() 32134 1727204445.29426: Calling all_inventory to load vars for managed-node2 32134 1727204445.29429: Calling groups_inventory to load vars for managed-node2 32134 1727204445.29431: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204445.29438: Calling all_plugins_play to load vars for managed-node2 32134 1727204445.29441: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204445.29443: Calling groups_plugins_play to load vars for managed-node2 32134 1727204445.30702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204445.32255: done with get_vars() 32134 1727204445.32275: done getting variables 32134 1727204445.32325: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:00:45 -0400 (0:00:00.049) 0:00:19.727 ***** 32134 1727204445.32351: entering _queue_task() for managed-node2/debug 32134 1727204445.32574: worker is 1 (out of 1 available) 32134 1727204445.32587: exiting _queue_task() for managed-node2/debug 32134 1727204445.32603: done queuing things up, now waiting for results queue to drain 32134 1727204445.32605: waiting for pending results... 32134 1727204445.32786: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32134 1727204445.32897: in run() - task 12b410aa-8751-753f-5162-00000000002a 32134 1727204445.32910: variable 'ansible_search_path' from source: unknown 32134 1727204445.32916: variable 'ansible_search_path' from source: unknown 32134 1727204445.32950: calling self._execute() 32134 1727204445.33034: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.33039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.33053: variable 'omit' from source: magic vars 32134 1727204445.33375: variable 'ansible_distribution_major_version' from source: facts 32134 1727204445.33386: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204445.33394: variable 'omit' from source: magic vars 32134 1727204445.33443: variable 'omit' from source: magic vars 32134 1727204445.33475: variable 'omit' from source: magic vars 32134 1727204445.33516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204445.33544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204445.33562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204445.33578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204445.33590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204445.33620: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204445.33624: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.33628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.33709: Set connection var ansible_timeout to 10 32134 1727204445.33726: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204445.33729: Set connection var ansible_connection to ssh 32134 1727204445.33732: Set connection var ansible_shell_type to sh 32134 1727204445.33739: Set connection var ansible_shell_executable to /bin/sh 32134 1727204445.33745: Set connection var ansible_pipelining to False 32134 1727204445.33764: variable 'ansible_shell_executable' from source: unknown 32134 1727204445.33767: variable 'ansible_connection' from source: unknown 32134 1727204445.33770: variable 'ansible_module_compression' from source: unknown 32134 1727204445.33774: variable 'ansible_shell_type' from source: unknown 32134 1727204445.33777: variable 'ansible_shell_executable' from source: unknown 32134 1727204445.33782: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.33787: variable 'ansible_pipelining' from source: unknown 32134 1727204445.33792: variable 'ansible_timeout' from source: unknown 32134 1727204445.33797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.33917: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204445.33940: variable 'omit' from source: magic vars 32134 1727204445.33944: starting attempt loop 32134 1727204445.33947: running the handler 32134 1727204445.33978: variable '__network_connections_result' from source: set_fact 32134 1727204445.34050: variable '__network_connections_result' from source: set_fact 32134 1727204445.34146: handler run complete 32134 1727204445.34170: attempt loop complete, returning result 32134 1727204445.34173: _execute() done 32134 1727204445.34176: dumping result to json 32134 1727204445.34182: done dumping result, returning 32134 1727204445.34192: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-753f-5162-00000000002a] 32134 1727204445.34197: sending task result for task 12b410aa-8751-753f-5162-00000000002a ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 2e4ab50e-0a87-42ab-af52-2be8774b7af4\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 2e4ab50e-0a87-42ab-af52-2be8774b7af4" ] } } 32134 1727204445.34397: no more pending results, returning what we have 32134 1727204445.34402: results queue empty 32134 1727204445.34403: checking for any_errors_fatal 32134 1727204445.34408: done checking for any_errors_fatal 32134 
1727204445.34409: checking for max_fail_percentage 32134 1727204445.34413: done checking for max_fail_percentage 32134 1727204445.34414: checking to see if all hosts have failed and the running result is not ok 32134 1727204445.34415: done checking to see if all hosts have failed 32134 1727204445.34416: getting the remaining hosts for this loop 32134 1727204445.34418: done getting the remaining hosts for this loop 32134 1727204445.34421: getting the next task for host managed-node2 32134 1727204445.34427: done getting next task for host managed-node2 32134 1727204445.34430: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32134 1727204445.34433: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204445.34444: getting variables 32134 1727204445.34446: in VariableManager get_vars() 32134 1727204445.34482: Calling all_inventory to load vars for managed-node2 32134 1727204445.34484: Calling groups_inventory to load vars for managed-node2 32134 1727204445.34486: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204445.34494: done sending task result for task 12b410aa-8751-753f-5162-00000000002a 32134 1727204445.34497: WORKER PROCESS EXITING 32134 1727204445.34505: Calling all_plugins_play to load vars for managed-node2 32134 1727204445.34507: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204445.34510: Calling groups_plugins_play to load vars for managed-node2 32134 1727204445.35816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204445.37680: done with get_vars() 32134 1727204445.37718: done getting variables 32134 1727204445.37796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:00:45 -0400 (0:00:00.054) 0:00:19.782 ***** 32134 1727204445.37847: entering _queue_task() for managed-node2/debug 32134 1727204445.38283: worker is 1 (out of 1 available) 32134 1727204445.38299: exiting _queue_task() for managed-node2/debug 32134 1727204445.38398: done queuing things up, now waiting for results queue to drain 32134 1727204445.38401: waiting for pending results... 
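The __network_connections_result echoed above comes from role input equivalent to the single connection shown in its module_args. A minimal sketch of a play that would drive this step, assuming the provider and host seen in the log (the actual tests_ipv6_disabled.yml playbook may differ):

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # matches "provider": "nm" in the result above
        network_connections:
          # values mirror the connection echoed in module_args above
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            ip:
              ipv6_disabled: true
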
32134 1727204445.38642: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32134 1727204445.38745: in run() - task 12b410aa-8751-753f-5162-00000000002b 32134 1727204445.38759: variable 'ansible_search_path' from source: unknown 32134 1727204445.38763: variable 'ansible_search_path' from source: unknown 32134 1727204445.38797: calling self._execute() 32134 1727204445.38874: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.38880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.38892: variable 'omit' from source: magic vars 32134 1727204445.39203: variable 'ansible_distribution_major_version' from source: facts 32134 1727204445.39217: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204445.39320: variable 'network_state' from source: role '' defaults 32134 1727204445.39331: Evaluated conditional (network_state != {}): False 32134 1727204445.39334: when evaluation is False, skipping this task 32134 1727204445.39337: _execute() done 32134 1727204445.39342: dumping result to json 32134 1727204445.39347: done dumping result, returning 32134 1727204445.39355: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-753f-5162-00000000002b] 32134 1727204445.39361: sending task result for task 12b410aa-8751-753f-5162-00000000002b 32134 1727204445.39456: done sending task result for task 12b410aa-8751-753f-5162-00000000002b 32134 1727204445.39459: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 32134 1727204445.39515: no more pending results, returning what we have 32134 1727204445.39519: results queue empty 32134 1727204445.39520: checking for any_errors_fatal 32134 1727204445.39528: done checking for any_errors_fatal 32134 1727204445.39529: checking for max_fail_percentage 32134 1727204445.39530: done checking for max_fail_percentage 32134 1727204445.39531: checking to see if all hosts have failed and the running result is not ok 32134 1727204445.39532: done checking to see if all hosts have failed 32134 1727204445.39533: getting the remaining hosts for this loop 32134 1727204445.39535: done getting the remaining hosts for this loop 32134 1727204445.39538: getting the next task for host managed-node2 32134 1727204445.39545: done getting next task for host managed-node2 32134 1727204445.39549: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32134 1727204445.39553: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204445.39568: getting variables 32134 1727204445.39570: in VariableManager get_vars() 32134 1727204445.39605: Calling all_inventory to load vars for managed-node2 32134 1727204445.39608: Calling groups_inventory to load vars for managed-node2 32134 1727204445.39610: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204445.39623: Calling all_plugins_play to load vars for managed-node2 32134 1727204445.39626: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204445.39629: Calling groups_plugins_play to load vars for managed-node2 32134 1727204445.41283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204445.45385: done with get_vars() 32134 1727204445.45423: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:00:45 -0400 (0:00:00.078) 0:00:19.861 ***** 32134 1727204445.45748: entering _queue_task() for managed-node2/ping 32134 1727204445.45750: Creating lock for ping 32134 1727204445.46357: worker is 1 (out of 1 available) 32134 1727204445.46370: exiting _queue_task() for managed-node2/ping 32134 1727204445.46383: done queuing things up, now waiting for results queue to drain 32134 1727204445.46385: waiting for pending results... 32134 1727204445.47108: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 32134 1727204445.47355: in run() - task 12b410aa-8751-753f-5162-00000000002c 32134 1727204445.47361: variable 'ansible_search_path' from source: unknown 32134 1727204445.47365: variable 'ansible_search_path' from source: unknown 32134 1727204445.47573: calling self._execute() 32134 1727204445.47792: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.47798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.47801: variable 'omit' from source: magic vars 32134 1727204445.48673: variable 'ansible_distribution_major_version' from source: facts 32134 1727204445.48698: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204445.48785: variable 'omit' from source: magic vars 32134 1727204445.48871: variable 'omit' from source: magic vars 32134 1727204445.49039: variable 'omit' from source: magic vars 32134 1727204445.49097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204445.49169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204445.49203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204445.49230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204445.49246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204445.49280: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204445.49299: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.49307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.49432: Set connection var ansible_timeout to 10 32134 
1727204445.49457: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204445.49465: Set connection var ansible_connection to ssh 32134 1727204445.49473: Set connection var ansible_shell_type to sh 32134 1727204445.49484: Set connection var ansible_shell_executable to /bin/sh 32134 1727204445.49499: Set connection var ansible_pipelining to False 32134 1727204445.49540: variable 'ansible_shell_executable' from source: unknown 32134 1727204445.49552: variable 'ansible_connection' from source: unknown 32134 1727204445.49561: variable 'ansible_module_compression' from source: unknown 32134 1727204445.49569: variable 'ansible_shell_type' from source: unknown 32134 1727204445.49576: variable 'ansible_shell_executable' from source: unknown 32134 1727204445.49584: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204445.49595: variable 'ansible_pipelining' from source: unknown 32134 1727204445.49603: variable 'ansible_timeout' from source: unknown 32134 1727204445.49617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204445.49887: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204445.49909: variable 'omit' from source: magic vars 32134 1727204445.49950: starting attempt loop 32134 1727204445.49954: running the handler 32134 1727204445.49956: _low_level_execute_command(): starting 32134 1727204445.49968: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204445.50761: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204445.50778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204445.50795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204445.50827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204445.50945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204445.50970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204445.51053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.52836: stdout chunk (state=3): >>>/root <<< 32134 1727204445.53056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204445.53059: stdout chunk (state=3): >>><<< 32134 1727204445.53062: stderr chunk (state=3): >>><<< 32134 1727204445.53205: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204445.53209: _low_level_execute_command(): starting 32134 1727204445.53215: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580 `" && echo ansible-tmp-1727204445.530966-33391-187064346119580="` echo /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580 `" ) && sleep 0' 32134 1727204445.53931: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204445.53967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204445.54019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204445.54110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204445.54154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204445.54174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204445.54261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.56327: stdout chunk (state=3): >>>ansible-tmp-1727204445.530966-33391-187064346119580=/root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580 <<< 32134 1727204445.56507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204445.56535: stderr chunk (state=3): >>><<< 32134 1727204445.56548: stdout chunk (state=3): >>><<< 32134 1727204445.56572: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204445.530966-33391-187064346119580=/root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204445.56638: variable 'ansible_module_compression' from source: unknown 32134 1727204445.56686: ANSIBALLZ: Using lock for ping 32134 1727204445.56696: ANSIBALLZ: Acquiring lock 32134 1727204445.56704: ANSIBALLZ: Lock acquired: 140589348074080 32134 1727204445.56712: ANSIBALLZ: Creating module 32134 1727204445.73464: ANSIBALLZ: Writing module into payload 32134 1727204445.73597: ANSIBALLZ: Writing module 32134 1727204445.73600: ANSIBALLZ: Renaming module 32134 1727204445.73603: ANSIBALLZ: Done creating module 32134 1727204445.73621: variable 'ansible_facts' from source: unknown 32134 1727204445.73707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py 32134 1727204445.73987: Sending initial data 32134 1727204445.73998: Sent initial data (152 bytes) 32134 1727204445.74528: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204445.74606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204445.74614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204445.74643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204445.74671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204445.74723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.76474: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32134 1727204445.76484: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204445.76523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204445.76570: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpw8xk3s7a /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py <<< 32134 1727204445.76574: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py" <<< 32134 1727204445.76621: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpw8xk3s7a" to remote "/root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py" <<< 32134 1727204445.77781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204445.77922: stderr chunk (state=3): >>><<< 32134 1727204445.77925: stdout chunk (state=3): >>><<< 32134 1727204445.77927: done transferring module to remote 32134 1727204445.77930: _low_level_execute_command(): starting 32134 1727204445.77932: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/ /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py && sleep 0' 32134 1727204445.78501: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204445.78532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204445.78550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204445.78652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204445.78688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 32134 1727204445.78719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204445.78758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204445.78828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.80891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204445.80894: stdout chunk (state=3): >>><<< 32134 1727204445.80897: stderr chunk (state=3): >>><<< 32134 1727204445.81021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204445.81025: _low_level_execute_command(): starting 32134 1727204445.81027: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/AnsiballZ_ping.py && sleep 0' 32134 1727204445.81606: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204445.81625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204445.81651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204445.81767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204445.81793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204445.81878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204445.99032: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 32134 
1727204446.00604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204446.00608: stdout chunk (state=3): >>><<< 32134 1727204446.00611: stderr chunk (state=3): >>><<< 32134 1727204446.00616: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204446.00622: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204446.00643: _low_level_execute_command(): starting 32134 1727204446.00654: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204445.530966-33391-187064346119580/ > /dev/null 2>&1 && sleep 0' 32134 1727204446.01388: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204446.01406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204446.01515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.01549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204446.01568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204446.01596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.01681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.03719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204446.03733: stdout chunk (state=3): >>><<< 32134 1727204446.03759: stderr chunk (state=3): >>><<< 32134 1727204446.03895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204446.03904: handler run complete 32134 1727204446.03907: attempt loop complete, returning result 32134 1727204446.03909: _execute() done 32134 1727204446.03914: dumping result to json 32134 1727204446.03916: done dumping result, returning 32134 1727204446.03918: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-753f-5162-00000000002c] 32134 1727204446.03920: sending task result for task 12b410aa-8751-753f-5162-00000000002c ok: [managed-node2] => { "changed": false, "ping": "pong" } 32134 1727204446.04068: no more pending results, returning what we have 32134 1727204446.04073: results queue empty 32134 1727204446.04074: checking for any_errors_fatal 32134 1727204446.04083: done checking for any_errors_fatal 32134 1727204446.04084: checking for max_fail_percentage 32134 1727204446.04086: done checking for max_fail_percentage 32134 1727204446.04087: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.04088: done checking to see if all hosts have failed 32134 1727204446.04091: getting the remaining hosts for this loop 32134 1727204446.04093: done getting the remaining hosts for this loop 32134 1727204446.04097: getting the next task for host managed-node2 32134 1727204446.04110: done getting next task for host managed-node2 32134 1727204446.04115: ^ task is: TASK: meta (role_complete) 32134 1727204446.04119: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204446.04133: getting variables 32134 1727204446.04135: in VariableManager get_vars() 32134 1727204446.04179: Calling all_inventory to load vars for managed-node2 32134 1727204446.04183: Calling groups_inventory to load vars for managed-node2 32134 1727204446.04186: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.04193: done sending task result for task 12b410aa-8751-753f-5162-00000000002c 32134 1727204446.04196: WORKER PROCESS EXITING 32134 1727204446.04402: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.04407: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.04421: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.07279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.10537: done with get_vars() 32134 1727204446.10592: done getting variables 32134 1727204446.10709: done queuing things up, now waiting for results queue to drain 32134 1727204446.10714: results queue empty 32134 1727204446.10716: checking for any_errors_fatal 32134 1727204446.10719: done checking for any_errors_fatal 32134 1727204446.10720: checking for max_fail_percentage 32134 1727204446.10722: done checking for max_fail_percentage 32134 1727204446.10723: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.10724: done checking to see if all hosts have failed 32134 1727204446.10725: getting the remaining hosts for this loop 32134 1727204446.10726: done getting the remaining hosts for this loop 32134 1727204446.10729: getting the next task for host managed-node2 32134 1727204446.10734: done getting next task for host managed-node2 32134 1727204446.10737: ^ task is: TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 32134 1727204446.10739: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204446.10741: getting variables 32134 1727204446.10742: in VariableManager get_vars() 32134 1727204446.10759: Calling all_inventory to load vars for managed-node2 32134 1727204446.10762: Calling groups_inventory to load vars for managed-node2 32134 1727204446.10764: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.10770: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.10779: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.10783: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.17643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.20241: done with get_vars() 32134 1727204446.20267: done getting variables 32134 1727204446.20308: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:41 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.745) 0:00:20.607 ***** 32134 1727204446.20330: entering _queue_task() for managed-node2/assert 32134 1727204446.20658: worker is 1 (out of 1 available) 32134 1727204446.20671: exiting _queue_task() for managed-node2/assert 32134 1727204446.20685: done queuing things up, now waiting for results queue to drain 32134 1727204446.20687: waiting for pending results... 
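The assert that runs next is gated on __network_connections_result.failed; because the role applied the profile successfully (failed: false above), the conditional evaluates to False and the task is skipped. A hedged sketch of what such a guarded assert could look like; the actual task body at tests_ipv6_disabled.yml:41 is not visible in this log, and the stderr match is an assumption:

- name: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworkManager does not support it
  assert:
    that:
      # assumption: a genuine failure would mention ipv6 support in the role's error output
      - "'ipv6' in __network_connections_result.stderr"
    fail_msg: network_connections failed for a reason other than missing ipv6_disabled support
  when: __network_connections_result.failed
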
32134 1727204446.20891: running TaskExecutor() for managed-node2/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 32134 1727204446.20976: in run() - task 12b410aa-8751-753f-5162-00000000005c 32134 1727204446.20988: variable 'ansible_search_path' from source: unknown 32134 1727204446.21033: calling self._execute() 32134 1727204446.21118: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.21129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.21140: variable 'omit' from source: magic vars 32134 1727204446.21476: variable 'ansible_distribution_major_version' from source: facts 32134 1727204446.21488: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204446.21587: variable '__network_connections_result' from source: set_fact 32134 1727204446.21605: Evaluated conditional (__network_connections_result.failed): False 32134 1727204446.21610: when evaluation is False, skipping this task 32134 1727204446.21613: _execute() done 32134 1727204446.21619: dumping result to json 32134 1727204446.21623: done dumping result, returning 32134 1727204446.21630: done running TaskExecutor() for managed-node2/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it [12b410aa-8751-753f-5162-00000000005c] 32134 1727204446.21637: sending task result for task 12b410aa-8751-753f-5162-00000000005c 32134 1727204446.21743: done sending task result for task 12b410aa-8751-753f-5162-00000000005c 32134 1727204446.21746: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 32134 1727204446.21799: no more pending results, returning what we have 32134 1727204446.21803: results queue empty 32134 1727204446.21804: checking for any_errors_fatal 32134 1727204446.21806: done checking for any_errors_fatal 32134 1727204446.21807: checking for max_fail_percentage 32134 1727204446.21809: done checking for max_fail_percentage 32134 1727204446.21810: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.21811: done checking to see if all hosts have failed 32134 1727204446.21812: getting the remaining hosts for this loop 32134 1727204446.21814: done getting the remaining hosts for this loop 32134 1727204446.21818: getting the next task for host managed-node2 32134 1727204446.21825: done getting next task for host managed-node2 32134 1727204446.21828: ^ task is: TASK: Verify nmcli connection ipv6.method 32134 1727204446.21831: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204446.21836: getting variables 32134 1727204446.21837: in VariableManager get_vars() 32134 1727204446.21878: Calling all_inventory to load vars for managed-node2 32134 1727204446.21882: Calling groups_inventory to load vars for managed-node2 32134 1727204446.21885: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.22003: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.22008: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.22015: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.23649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.25223: done with get_vars() 32134 1727204446.25246: done getting variables 32134 1727204446.25329: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Verify nmcli connection ipv6.method] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:48 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.050) 0:00:20.657 ***** 32134 1727204446.25351: entering _queue_task() for managed-node2/shell 32134 1727204446.25353: Creating lock for shell 32134 1727204446.25610: worker is 1 (out of 1 available) 32134 1727204446.25628: exiting _queue_task() for managed-node2/shell 32134 1727204446.25641: done queuing things up, now waiting for results queue to drain 32134 1727204446.25643: waiting for pending results... 
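The verification task that follows shells out to nmcli on the managed node, using the interface name stored earlier with set_fact, and only runs because not __network_connections_result.failed evaluates to True. The exact command is not visible in this log; a plausible sketch, assuming the test checks that the profile ended up with ipv6.method disabled:

- name: Verify nmcli connection ipv6.method
  shell: nmcli connection show {{ interface }} | grep ipv6.method
  register: ipv6_method
  # assumption: the expected value for a profile created with ipv6_disabled set to true
  failed_when: "'disabled' not in ipv6_method.stdout"
  when: not __network_connections_result.failed
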
32134 1727204446.25834: running TaskExecutor() for managed-node2/TASK: Verify nmcli connection ipv6.method 32134 1727204446.25917: in run() - task 12b410aa-8751-753f-5162-00000000005d 32134 1727204446.25927: variable 'ansible_search_path' from source: unknown 32134 1727204446.25961: calling self._execute() 32134 1727204446.26050: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.26058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.26068: variable 'omit' from source: magic vars 32134 1727204446.26405: variable 'ansible_distribution_major_version' from source: facts 32134 1727204446.26420: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204446.26523: variable '__network_connections_result' from source: set_fact 32134 1727204446.26541: Evaluated conditional (not __network_connections_result.failed): True 32134 1727204446.26548: variable 'omit' from source: magic vars 32134 1727204446.26566: variable 'omit' from source: magic vars 32134 1727204446.26652: variable 'interface' from source: set_fact 32134 1727204446.26666: variable 'omit' from source: magic vars 32134 1727204446.26704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204446.26735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204446.26761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204446.26778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204446.26790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204446.26820: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204446.26823: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.26828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.26918: Set connection var ansible_timeout to 10 32134 1727204446.26929: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204446.26932: Set connection var ansible_connection to ssh 32134 1727204446.26935: Set connection var ansible_shell_type to sh 32134 1727204446.26942: Set connection var ansible_shell_executable to /bin/sh 32134 1727204446.26949: Set connection var ansible_pipelining to False 32134 1727204446.26970: variable 'ansible_shell_executable' from source: unknown 32134 1727204446.26975: variable 'ansible_connection' from source: unknown 32134 1727204446.26978: variable 'ansible_module_compression' from source: unknown 32134 1727204446.26980: variable 'ansible_shell_type' from source: unknown 32134 1727204446.26983: variable 'ansible_shell_executable' from source: unknown 32134 1727204446.26985: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.26993: variable 'ansible_pipelining' from source: unknown 32134 1727204446.26996: variable 'ansible_timeout' from source: unknown 32134 1727204446.27002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.27124: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204446.27134: variable 'omit' from source: magic vars 32134 1727204446.27140: starting attempt loop 32134 1727204446.27144: running the handler 32134 1727204446.27153: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204446.27170: _low_level_execute_command(): starting 32134 1727204446.27177: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204446.27740: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.27744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.27747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.27750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.27797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204446.27801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.27864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.29680: stdout chunk (state=3): >>>/root <<< 32134 1727204446.29796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204446.29852: stderr chunk (state=3): >>><<< 32134 1727204446.29855: stdout chunk (state=3): >>><<< 32134 1727204446.29877: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204446.29893: _low_level_execute_command(): starting 32134 1727204446.29902: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574 `" && echo ansible-tmp-1727204446.2987678-33413-217586407682574="` echo /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574 `" ) && sleep 0' 32134 1727204446.30350: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.30363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204446.30387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204446.30394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.30459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204446.30463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204446.30468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.30512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.32617: stdout chunk (state=3): >>>ansible-tmp-1727204446.2987678-33413-217586407682574=/root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574 <<< 32134 1727204446.32737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204446.32785: stderr chunk (state=3): >>><<< 32134 1727204446.32792: stdout chunk (state=3): >>><<< 32134 1727204446.32810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204446.2987678-33413-217586407682574=/root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204446.32839: variable 'ansible_module_compression' from source: unknown 32134 1727204446.32882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204446.32918: variable 'ansible_facts' from source: unknown 32134 1727204446.32982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py 32134 1727204446.33100: Sending initial data 32134 1727204446.33104: Sent initial data (156 bytes) 32134 1727204446.33569: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204446.33573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.33576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.33578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.33632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204446.33635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.33679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.35379: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32134 1727204446.35385: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204446.35415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204446.35452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpo0kvdfoh /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py <<< 32134 1727204446.35459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py" <<< 32134 1727204446.35487: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpo0kvdfoh" to remote "/root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py" <<< 32134 1727204446.36272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204446.36327: stderr chunk (state=3): >>><<< 32134 1727204446.36330: stdout chunk (state=3): >>><<< 32134 1727204446.36352: done transferring module to remote 32134 1727204446.36366: _low_level_execute_command(): starting 32134 1727204446.36369: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/ /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py && sleep 0' 32134 1727204446.36787: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.36826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204446.36830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.36832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.36838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.36884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204446.36888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.36933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.38821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204446.38862: stderr chunk (state=3): >>><<< 32134 1727204446.38865: stdout chunk (state=3): >>><<< 32134 1727204446.38881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204446.38884: _low_level_execute_command(): starting 32134 1727204446.38892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/AnsiballZ_command.py && sleep 0' 32134 1727204446.39294: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.39336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204446.39340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.39343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.39345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.39391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204446.39395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.39445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.59070: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-24 15:00:46.569629", "end": "2024-09-24 15:00:46.589686", "delta": "0:00:00.020057", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204446.60940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204446.60944: stdout chunk (state=3): >>><<< 32134 1727204446.60947: stderr chunk (state=3): >>><<< 32134 1727204446.60968: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-24 15:00:46.569629", "end": "2024-09-24 15:00:46.589686", "delta": "0:00:00.020057", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
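For context, the command-module invocation recorded above corresponds to a shell task in tests_ipv6_disabled.yml. A minimal reconstruction from the logged _raw_params, the later reference to ipv6_method.stdout, and the displayed result reporting changed: false (the task wording and the changed_when handling are assumptions, not quoted from the playbook):

- name: Verify nmcli connection ipv6.method
  shell: |
    set -euxo pipefail
    nmcli connection show ethtest0 | grep ipv6.method
  register: ipv6_method   # consumed by the assert task that follows in the log
  changed_when: false     # assumption, inferred from the displayed result showing changed: false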
32134 1727204446.61099: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204446.61102: _low_level_execute_command(): starting 32134 1727204446.61105: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204446.2987678-33413-217586407682574/ > /dev/null 2>&1 && sleep 0' 32134 1727204446.61696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204446.61716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204446.61740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.61829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204446.61832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204446.61835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204446.61870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204446.61874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204446.61928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204446.64093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204446.64097: stdout chunk (state=3): >>><<< 32134 1727204446.64099: stderr chunk (state=3): >>><<< 32134 1727204446.64104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204446.64106: handler run complete 32134 1727204446.64108: Evaluated conditional (False): False 32134 1727204446.64110: attempt loop complete, returning result 32134 1727204446.64115: _execute() done 32134 1727204446.64117: dumping result to json 32134 1727204446.64119: done dumping result, returning 32134 1727204446.64122: done running TaskExecutor() for managed-node2/TASK: Verify nmcli connection ipv6.method [12b410aa-8751-753f-5162-00000000005d] 32134 1727204446.64124: sending task result for task 12b410aa-8751-753f-5162-00000000005d 32134 1727204446.64269: done sending task result for task 12b410aa-8751-753f-5162-00000000005d 32134 1727204446.64272: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "delta": "0:00:00.020057", "end": "2024-09-24 15:00:46.589686", "rc": 0, "start": "2024-09-24 15:00:46.569629" } STDOUT: ipv6.method: disabled STDERR: + nmcli connection show ethtest0 + grep ipv6.method 32134 1727204446.64580: no more pending results, returning what we have 32134 1727204446.64583: results queue empty 32134 1727204446.64584: checking for any_errors_fatal 32134 1727204446.64592: done checking for any_errors_fatal 32134 1727204446.64593: checking for max_fail_percentage 32134 1727204446.64595: done checking for max_fail_percentage 32134 1727204446.64596: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.64597: done checking to see if all hosts have failed 32134 1727204446.64597: getting the remaining hosts for this loop 32134 1727204446.64599: done getting the remaining hosts for this loop 32134 1727204446.64607: getting the next task for host managed-node2 32134 1727204446.64616: done getting next task for host managed-node2 32134 1727204446.64619: ^ task is: TASK: Assert that ipv6.method disabled is configured correctly 32134 1727204446.64621: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204446.64624: getting variables 32134 1727204446.64626: in VariableManager get_vars() 32134 1727204446.64661: Calling all_inventory to load vars for managed-node2 32134 1727204446.64664: Calling groups_inventory to load vars for managed-node2 32134 1727204446.64667: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.64678: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.64681: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.64684: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.67100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.70080: done with get_vars() 32134 1727204446.70116: done getting variables 32134 1727204446.70186: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that ipv6.method disabled is configured correctly] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:57 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.448) 0:00:21.106 ***** 32134 1727204446.70219: entering _queue_task() for managed-node2/assert 32134 1727204446.70582: worker is 1 (out of 1 available) 32134 1727204446.70799: exiting _queue_task() for managed-node2/assert 32134 1727204446.70812: done queuing things up, now waiting for results queue to drain 32134 1727204446.70814: waiting for pending results... 32134 1727204446.71045: running TaskExecutor() for managed-node2/TASK: Assert that ipv6.method disabled is configured correctly 32134 1727204446.71139: in run() - task 12b410aa-8751-753f-5162-00000000005e 32134 1727204446.71146: variable 'ansible_search_path' from source: unknown 32134 1727204446.71149: calling self._execute() 32134 1727204446.71222: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.71358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.71362: variable 'omit' from source: magic vars 32134 1727204446.71718: variable 'ansible_distribution_major_version' from source: facts 32134 1727204446.71732: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204446.71891: variable '__network_connections_result' from source: set_fact 32134 1727204446.71914: Evaluated conditional (not __network_connections_result.failed): True 32134 1727204446.71918: variable 'omit' from source: magic vars 32134 1727204446.71953: variable 'omit' from source: magic vars 32134 1727204446.71994: variable 'omit' from source: magic vars 32134 1727204446.72048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204446.72095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204446.72129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204446.72229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204446.72234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 32134 1727204446.72238: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204446.72240: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.72243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.72335: Set connection var ansible_timeout to 10 32134 1727204446.72355: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204446.72360: Set connection var ansible_connection to ssh 32134 1727204446.72364: Set connection var ansible_shell_type to sh 32134 1727204446.72375: Set connection var ansible_shell_executable to /bin/sh 32134 1727204446.72446: Set connection var ansible_pipelining to False 32134 1727204446.72450: variable 'ansible_shell_executable' from source: unknown 32134 1727204446.72453: variable 'ansible_connection' from source: unknown 32134 1727204446.72455: variable 'ansible_module_compression' from source: unknown 32134 1727204446.72458: variable 'ansible_shell_type' from source: unknown 32134 1727204446.72464: variable 'ansible_shell_executable' from source: unknown 32134 1727204446.72468: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.72470: variable 'ansible_pipelining' from source: unknown 32134 1727204446.72473: variable 'ansible_timeout' from source: unknown 32134 1727204446.72476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.72623: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204446.72632: variable 'omit' from source: magic vars 32134 1727204446.72645: starting attempt loop 32134 1727204446.72648: running the handler 32134 1727204446.72932: variable 'ipv6_method' from source: set_fact 32134 1727204446.72935: Evaluated conditional ('disabled' in ipv6_method.stdout): True 32134 1727204446.72937: handler run complete 32134 1727204446.72939: attempt loop complete, returning result 32134 1727204446.72942: _execute() done 32134 1727204446.72943: dumping result to json 32134 1727204446.72945: done dumping result, returning 32134 1727204446.72947: done running TaskExecutor() for managed-node2/TASK: Assert that ipv6.method disabled is configured correctly [12b410aa-8751-753f-5162-00000000005e] 32134 1727204446.72949: sending task result for task 12b410aa-8751-753f-5162-00000000005e 32134 1727204446.73228: done sending task result for task 12b410aa-8751-753f-5162-00000000005e 32134 1727204446.73233: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 32134 1727204446.73435: no more pending results, returning what we have 32134 1727204446.73439: results queue empty 32134 1727204446.73440: checking for any_errors_fatal 32134 1727204446.73447: done checking for any_errors_fatal 32134 1727204446.73448: checking for max_fail_percentage 32134 1727204446.73450: done checking for max_fail_percentage 32134 1727204446.73451: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.73452: done checking to see if all hosts have failed 32134 1727204446.73454: getting the remaining hosts for this loop 32134 1727204446.73455: done getting the remaining hosts for this loop 32134 1727204446.73459: getting the 
next task for host managed-node2 32134 1727204446.73464: done getting next task for host managed-node2 32134 1727204446.73467: ^ task is: TASK: Set the connection_failed flag 32134 1727204446.73470: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204446.73473: getting variables 32134 1727204446.73474: in VariableManager get_vars() 32134 1727204446.73511: Calling all_inventory to load vars for managed-node2 32134 1727204446.73514: Calling groups_inventory to load vars for managed-node2 32134 1727204446.73518: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.73529: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.73532: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.73536: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.75837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.78835: done with get_vars() 32134 1727204446.78875: done getting variables 32134 1727204446.78950: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set the connection_failed flag] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:64 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.087) 0:00:21.193 ***** 32134 1727204446.78983: entering _queue_task() for managed-node2/set_fact 32134 1727204446.79359: worker is 1 (out of 1 available) 32134 1727204446.79375: exiting _queue_task() for managed-node2/set_fact 32134 1727204446.79592: done queuing things up, now waiting for results queue to drain 32134 1727204446.79595: waiting for pending results... 
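The assert that just passed (task path tests_ipv6_disabled.yml:57) can be pieced together from the conditionals the log reports as evaluated. A rough sketch, noting that the real task may carry additional conditions (for example the distribution-version check also evaluated above) and different formatting:

- name: Assert that ipv6.method disabled is configured correctly
  assert:
    that:
      - "'disabled' in ipv6_method.stdout"
  when: not __network_connections_result.failed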
32134 1727204446.79811: running TaskExecutor() for managed-node2/TASK: Set the connection_failed flag 32134 1727204446.79851: in run() - task 12b410aa-8751-753f-5162-00000000005f 32134 1727204446.79855: variable 'ansible_search_path' from source: unknown 32134 1727204446.79933: calling self._execute() 32134 1727204446.80018: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204446.80022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204446.80039: variable 'omit' from source: magic vars 32134 1727204446.80585: variable 'ansible_distribution_major_version' from source: facts 32134 1727204446.80591: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204446.80666: variable '__network_connections_result' from source: set_fact 32134 1727204446.80686: Evaluated conditional (__network_connections_result.failed): False 32134 1727204446.80691: when evaluation is False, skipping this task 32134 1727204446.80703: _execute() done 32134 1727204446.80706: dumping result to json 32134 1727204446.80714: done dumping result, returning 32134 1727204446.80717: done running TaskExecutor() for managed-node2/TASK: Set the connection_failed flag [12b410aa-8751-753f-5162-00000000005f] 32134 1727204446.80726: sending task result for task 12b410aa-8751-753f-5162-00000000005f 32134 1727204446.80836: done sending task result for task 12b410aa-8751-753f-5162-00000000005f 32134 1727204446.80840: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 32134 1727204446.80900: no more pending results, returning what we have 32134 1727204446.80906: results queue empty 32134 1727204446.80908: checking for any_errors_fatal 32134 1727204446.80918: done checking for any_errors_fatal 32134 1727204446.80919: checking for max_fail_percentage 32134 1727204446.80921: done checking for max_fail_percentage 32134 1727204446.80921: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.80923: done checking to see if all hosts have failed 32134 1727204446.80924: getting the remaining hosts for this loop 32134 1727204446.80925: done getting the remaining hosts for this loop 32134 1727204446.80930: getting the next task for host managed-node2 32134 1727204446.80943: done getting next task for host managed-node2 32134 1727204446.80946: ^ task is: TASK: meta (flush_handlers) 32134 1727204446.80950: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204446.80955: getting variables 32134 1727204446.80958: in VariableManager get_vars() 32134 1727204446.81004: Calling all_inventory to load vars for managed-node2 32134 1727204446.81008: Calling groups_inventory to load vars for managed-node2 32134 1727204446.81011: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.81029: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.81033: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.81038: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.83385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.86354: done with get_vars() 32134 1727204446.86396: done getting variables 32134 1727204446.86475: in VariableManager get_vars() 32134 1727204446.86492: Calling all_inventory to load vars for managed-node2 32134 1727204446.86495: Calling groups_inventory to load vars for managed-node2 32134 1727204446.86498: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.86504: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.86507: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.86511: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.88610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.91407: done with get_vars() 32134 1727204446.91452: done queuing things up, now waiting for results queue to drain 32134 1727204446.91455: results queue empty 32134 1727204446.91456: checking for any_errors_fatal 32134 1727204446.91460: done checking for any_errors_fatal 32134 1727204446.91461: checking for max_fail_percentage 32134 1727204446.91462: done checking for max_fail_percentage 32134 1727204446.91463: checking to see if all hosts have failed and the running result is not ok 32134 1727204446.91465: done checking to see if all hosts have failed 32134 1727204446.91466: getting the remaining hosts for this loop 32134 1727204446.91467: done getting the remaining hosts for this loop 32134 1727204446.91470: getting the next task for host managed-node2 32134 1727204446.91475: done getting next task for host managed-node2 32134 1727204446.91477: ^ task is: TASK: meta (flush_handlers) 32134 1727204446.91479: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204446.91482: getting variables 32134 1727204446.91483: in VariableManager get_vars() 32134 1727204446.91500: Calling all_inventory to load vars for managed-node2 32134 1727204446.91503: Calling groups_inventory to load vars for managed-node2 32134 1727204446.91506: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.91513: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.91516: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.91520: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.93512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204446.96446: done with get_vars() 32134 1727204446.96479: done getting variables 32134 1727204446.96543: in VariableManager get_vars() 32134 1727204446.96558: Calling all_inventory to load vars for managed-node2 32134 1727204446.96561: Calling groups_inventory to load vars for managed-node2 32134 1727204446.96563: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204446.96569: Calling all_plugins_play to load vars for managed-node2 32134 1727204446.96572: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204446.96576: Calling groups_plugins_play to load vars for managed-node2 32134 1727204446.98470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204447.01304: done with get_vars() 32134 1727204447.01354: done queuing things up, now waiting for results queue to drain 32134 1727204447.01356: results queue empty 32134 1727204447.01357: checking for any_errors_fatal 32134 1727204447.01359: done checking for any_errors_fatal 32134 1727204447.01360: checking for max_fail_percentage 32134 1727204447.01362: done checking for max_fail_percentage 32134 1727204447.01363: checking to see if all hosts have failed and the running result is not ok 32134 1727204447.01364: done checking to see if all hosts have failed 32134 1727204447.01365: getting the remaining hosts for this loop 32134 1727204447.01366: done getting the remaining hosts for this loop 32134 1727204447.01369: getting the next task for host managed-node2 32134 1727204447.01380: done getting next task for host managed-node2 32134 1727204447.01381: ^ task is: None 32134 1727204447.01383: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204447.01384: done queuing things up, now waiting for results queue to drain 32134 1727204447.01386: results queue empty 32134 1727204447.01387: checking for any_errors_fatal 32134 1727204447.01387: done checking for any_errors_fatal 32134 1727204447.01388: checking for max_fail_percentage 32134 1727204447.01391: done checking for max_fail_percentage 32134 1727204447.01392: checking to see if all hosts have failed and the running result is not ok 32134 1727204447.01393: done checking to see if all hosts have failed 32134 1727204447.01396: getting the next task for host managed-node2 32134 1727204447.01399: done getting next task for host managed-node2 32134 1727204447.01400: ^ task is: None 32134 1727204447.01401: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204447.01467: in VariableManager get_vars() 32134 1727204447.01497: done with get_vars() 32134 1727204447.01505: in VariableManager get_vars() 32134 1727204447.01522: done with get_vars() 32134 1727204447.01528: variable 'omit' from source: magic vars 32134 1727204447.01665: variable 'profile' from source: play vars 32134 1727204447.01808: in VariableManager get_vars() 32134 1727204447.01825: done with get_vars() 32134 1727204447.01851: variable 'omit' from source: magic vars 32134 1727204447.01931: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 32134 1727204447.02892: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 32134 1727204447.02920: getting the remaining hosts for this loop 32134 1727204447.02922: done getting the remaining hosts for this loop 32134 1727204447.02925: getting the next task for host managed-node2 32134 1727204447.02929: done getting next task for host managed-node2 32134 1727204447.02931: ^ task is: TASK: Gathering Facts 32134 1727204447.02933: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204447.02935: getting variables 32134 1727204447.02937: in VariableManager get_vars() 32134 1727204447.02950: Calling all_inventory to load vars for managed-node2 32134 1727204447.02953: Calling groups_inventory to load vars for managed-node2 32134 1727204447.02956: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204447.02962: Calling all_plugins_play to load vars for managed-node2 32134 1727204447.02966: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204447.02970: Calling groups_plugins_play to load vars for managed-node2 32134 1727204447.05131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204447.07962: done with get_vars() 32134 1727204447.08005: done getting variables 32134 1727204447.08063: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 15:00:47 -0400 (0:00:00.291) 0:00:21.484 ***** 32134 1727204447.08097: entering _queue_task() for managed-node2/gather_facts 32134 1727204447.08464: worker is 1 (out of 1 available) 32134 1727204447.08475: exiting _queue_task() for managed-node2/gather_facts 32134 1727204447.08488: done queuing things up, now waiting for results queue to drain 32134 1727204447.08697: waiting for pending results... 
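For completeness, the "Set the connection_failed flag" task reported as skipped a few records earlier (tests_ipv6_disabled.yml:64) is a set_fact guarded by the condition the log shows as false. A plausible reconstruction, where the fact value is an assumption since only the guard and the skip are visible in the log:

- name: Set the connection_failed flag
  set_fact:
    connection_failed: true   # assumed value; the log only shows the guard and the skip
  when: __network_connections_result.failed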
32134 1727204447.08827: running TaskExecutor() for managed-node2/TASK: Gathering Facts 32134 1727204447.09034: in run() - task 12b410aa-8751-753f-5162-000000000454 32134 1727204447.09038: variable 'ansible_search_path' from source: unknown 32134 1727204447.09042: calling self._execute() 32134 1727204447.09119: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204447.09137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204447.09158: variable 'omit' from source: magic vars 32134 1727204447.09624: variable 'ansible_distribution_major_version' from source: facts 32134 1727204447.09689: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204447.09694: variable 'omit' from source: magic vars 32134 1727204447.09697: variable 'omit' from source: magic vars 32134 1727204447.09742: variable 'omit' from source: magic vars 32134 1727204447.09797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204447.09845: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204447.09876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204447.09909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204447.09929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204447.10015: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204447.10019: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204447.10021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204447.10147: Set connection var ansible_timeout to 10 32134 1727204447.10159: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204447.10162: Set connection var ansible_connection to ssh 32134 1727204447.10165: Set connection var ansible_shell_type to sh 32134 1727204447.10172: Set connection var ansible_shell_executable to /bin/sh 32134 1727204447.10179: Set connection var ansible_pipelining to False 32134 1727204447.10204: variable 'ansible_shell_executable' from source: unknown 32134 1727204447.10208: variable 'ansible_connection' from source: unknown 32134 1727204447.10211: variable 'ansible_module_compression' from source: unknown 32134 1727204447.10216: variable 'ansible_shell_type' from source: unknown 32134 1727204447.10219: variable 'ansible_shell_executable' from source: unknown 32134 1727204447.10221: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204447.10224: variable 'ansible_pipelining' from source: unknown 32134 1727204447.10228: variable 'ansible_timeout' from source: unknown 32134 1727204447.10231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204447.10391: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204447.10402: variable 'omit' from source: magic vars 32134 1727204447.10408: starting attempt loop 32134 1727204447.10411: running the 
handler 32134 1727204447.10428: variable 'ansible_facts' from source: unknown 32134 1727204447.10446: _low_level_execute_command(): starting 32134 1727204447.10454: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204447.10958: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204447.10998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204447.11002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204447.11005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204447.11018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204447.11071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204447.11076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204447.11139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204447.12910: stdout chunk (state=3): >>>/root <<< 32134 1727204447.13020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204447.13077: stderr chunk (state=3): >>><<< 32134 1727204447.13079: stdout chunk (state=3): >>><<< 32134 1727204447.13106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204447.13121: _low_level_execute_command(): starting 32134 1727204447.13127: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021 `" && echo 
ansible-tmp-1727204447.1310344-33437-78020060903021="` echo /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021 `" ) && sleep 0' 32134 1727204447.13572: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204447.13576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204447.13578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204447.13588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204447.13591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204447.13648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204447.13650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204447.13682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204447.15667: stdout chunk (state=3): >>>ansible-tmp-1727204447.1310344-33437-78020060903021=/root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021 <<< 32134 1727204447.15815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204447.15873: stderr chunk (state=3): >>><<< 32134 1727204447.15888: stdout chunk (state=3): >>><<< 32134 1727204447.15923: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204447.1310344-33437-78020060903021=/root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204447.16095: variable 'ansible_module_compression' from source: unknown 32134 1727204447.16098: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32134 1727204447.16101: variable 'ansible_facts' from source: unknown 32134 1727204447.16299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py 32134 1727204447.16573: Sending initial data 32134 1727204447.16576: Sent initial data (153 bytes) 32134 1727204447.17092: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204447.17109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204447.17129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204447.17150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204447.17259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204447.17284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204447.17350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204447.18997: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32134 1727204447.19036: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204447.19064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204447.19139: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmps6_05hpu /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py <<< 32134 1727204447.19144: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py" <<< 32134 1727204447.19238: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmps6_05hpu" to remote "/root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py" <<< 32134 1727204447.21609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204447.21654: stderr chunk (state=3): >>><<< 32134 1727204447.21668: stdout chunk (state=3): >>><<< 32134 1727204447.21706: done transferring module to remote 32134 1727204447.21733: _low_level_execute_command(): starting 32134 1727204447.21745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/ /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py && sleep 0' 32134 1727204447.22420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204447.22434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204447.22460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204447.22481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204447.22576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204447.22615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204447.22637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204447.22668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204447.22735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204447.24730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204447.24740: stdout chunk (state=3): >>><<< 32134 1727204447.24757: stderr chunk (state=3): >>><<< 32134 1727204447.24796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204447.24799: _low_level_execute_command(): starting 32134 1727204447.24893: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/AnsiballZ_setup.py && sleep 0' 32134 1727204447.25507: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204447.25537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204447.25556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204447.25577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204447.25651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204447.97985: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0"<<< 32134 1727204447.98014: stdout chunk (state=3): >>>: true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.64453125, "5m": 0.68212890625, "15m": 0.46630859375}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", 
"weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "47", "epoch": "1727204447", "epoch_int": "1727204447", "date": "2024-09-24", "time": "15:00:47", "iso8601_micro": "2024-09-24T19:00:47.574871Z", "iso8601": "2024-09-24T19:00:47Z", "iso8601_basic": "20240924T150047574871", "iso8601_basic_short": "20240924T150047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2834, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 883, "free": 2834}, "nocache": {"free": 3472, "used": 245}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 951, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, 
"size_available": 251144699904, "block_size": 4096, "block_total": 64479564, "block_available": 61314624, "block_used": 3164940, "inode_<<< 32134 1727204447.98065: stdout chunk (state=3): >>>total": 16384000, "inode_available": 16302235, "inode_used": 81765, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "ethtest0", "lo", "peerethtest0"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "56:10:7a:3f:31:3a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::928c:aece:b50f:aeb4", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off<<< 32134 1727204447.98070: stdout chunk (state=3): >>> [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ea:ec:31:9c:5f:fe", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ec:31ff:fe9c:5ffe", "prefix": "64", "scope": "link"}], "features": {"rx<<< 32134 1727204447.98084: stdout chunk (state=3): >>>_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": 
[]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::928c:aece:b50f:aeb4", "fe80::4a44:1e77:128f:34e8", "fe80::e8ec:31ff:fe9c:5ffe"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8", "fe80::928c:aece:b50f:aeb4", "fe80::e8ec:31ff:fe9c:5ffe"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32134 1727204448.00130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204448.00200: stderr chunk (state=3): >>><<< 32134 1727204448.00203: stdout chunk (state=3): >>><<< 32134 1727204448.00250: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.64453125, "5m": 0.68212890625, "15m": 0.46630859375}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "47", "epoch": "1727204447", "epoch_int": "1727204447", "date": "2024-09-24", "time": "15:00:47", "iso8601_micro": "2024-09-24T19:00:47.574871Z", "iso8601": "2024-09-24T19:00:47Z", "iso8601_basic": "20240924T150047574871", "iso8601_basic_short": "20240924T150047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2834, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 883, "free": 2834}, "nocache": {"free": 3472, "used": 245}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 951, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144699904, "block_size": 4096, "block_total": 64479564, "block_available": 61314624, "block_used": 3164940, "inode_total": 16384000, "inode_available": 16302235, "inode_used": 81765, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "ethtest0", "lo", "peerethtest0"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "56:10:7a:3f:31:3a", "mtu": 1500, "active": true, "type": 
"ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::928c:aece:b50f:aeb4", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ea:ec:31:9c:5f:fe", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ec:31ff:fe9c:5ffe", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::928c:aece:b50f:aeb4", "fe80::4a44:1e77:128f:34e8", "fe80::e8ec:31ff:fe9c:5ffe"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8", "fe80::928c:aece:b50f:aeb4", "fe80::e8ec:31ff:fe9c:5ffe"]}, "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
32134 1727204448.00679: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204448.00700: _low_level_execute_command(): starting 32134 1727204448.00705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204447.1310344-33437-78020060903021/ > /dev/null 2>&1 && sleep 0' 32134 1727204448.01196: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204448.01200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.01202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204448.01205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204448.01220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.01269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204448.01275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204448.01278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204448.01318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204448.03297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204448.03352: stderr chunk (state=3): >>><<< 32134 1727204448.03356: stdout chunk (state=3): >>><<< 32134 1727204448.03371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204448.03381: handler run complete 32134 1727204448.03524: variable 'ansible_facts' from source: unknown 32134 1727204448.03631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.03978: variable 'ansible_facts' from source: unknown 32134 1727204448.04066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.04211: attempt loop complete, returning result 32134 1727204448.04217: _execute() done 32134 1727204448.04219: dumping result to json 32134 1727204448.04253: done dumping result, returning 32134 1727204448.04260: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-753f-5162-000000000454] 32134 1727204448.04265: sending task result for task 12b410aa-8751-753f-5162-000000000454 ok: [managed-node2] 32134 1727204448.05063: no more pending results, returning what we have 32134 1727204448.05066: results queue empty 32134 1727204448.05066: checking for any_errors_fatal 32134 1727204448.05067: done checking for any_errors_fatal 32134 1727204448.05068: checking for max_fail_percentage 32134 1727204448.05069: done checking for max_fail_percentage 32134 1727204448.05070: checking to see if all hosts have failed and the running result is not ok 32134 1727204448.05070: done checking to see if all hosts have failed 32134 1727204448.05071: getting the remaining hosts for this loop 32134 1727204448.05072: done getting the remaining hosts for this loop 32134 1727204448.05075: getting the next task for host managed-node2 32134 1727204448.05079: done getting next task for host managed-node2 32134 1727204448.05080: ^ task is: TASK: meta (flush_handlers) 32134 1727204448.05082: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204448.05085: getting variables 32134 1727204448.05086: in VariableManager get_vars() 32134 1727204448.05112: Calling all_inventory to load vars for managed-node2 32134 1727204448.05115: Calling groups_inventory to load vars for managed-node2 32134 1727204448.05116: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.05127: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.05130: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.05134: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.05653: done sending task result for task 12b410aa-8751-753f-5162-000000000454 32134 1727204448.05657: WORKER PROCESS EXITING 32134 1727204448.06497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.08665: done with get_vars() 32134 1727204448.08688: done getting variables 32134 1727204448.08750: in VariableManager get_vars() 32134 1727204448.08763: Calling all_inventory to load vars for managed-node2 32134 1727204448.08766: Calling groups_inventory to load vars for managed-node2 32134 1727204448.08769: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.08774: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.08776: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.08778: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.09948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.12395: done with get_vars() 32134 1727204448.12440: done queuing things up, now waiting for results queue to drain 32134 1727204448.12443: results queue empty 32134 1727204448.12444: checking for any_errors_fatal 32134 1727204448.12449: done checking for any_errors_fatal 32134 1727204448.12450: checking for max_fail_percentage 32134 1727204448.12452: done checking for max_fail_percentage 32134 1727204448.12457: checking to see if all hosts have failed and the running result is not ok 32134 1727204448.12458: done checking to see if all hosts have failed 32134 1727204448.12459: getting the remaining hosts for this loop 32134 1727204448.12460: done getting the remaining hosts for this loop 32134 1727204448.12464: getting the next task for host managed-node2 32134 1727204448.12469: done getting next task for host managed-node2 32134 1727204448.12473: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32134 1727204448.12475: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204448.12486: getting variables 32134 1727204448.12488: in VariableManager get_vars() 32134 1727204448.12507: Calling all_inventory to load vars for managed-node2 32134 1727204448.12509: Calling groups_inventory to load vars for managed-node2 32134 1727204448.12515: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.12521: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.12524: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.12528: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.14599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.17603: done with get_vars() 32134 1727204448.17640: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:48 -0400 (0:00:01.096) 0:00:22.581 ***** 32134 1727204448.17741: entering _queue_task() for managed-node2/include_tasks 32134 1727204448.18133: worker is 1 (out of 1 available) 32134 1727204448.18149: exiting _queue_task() for managed-node2/include_tasks 32134 1727204448.18165: done queuing things up, now waiting for results queue to drain 32134 1727204448.18167: waiting for pending results... 32134 1727204448.18511: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32134 1727204448.18632: in run() - task 12b410aa-8751-753f-5162-000000000067 32134 1727204448.18663: variable 'ansible_search_path' from source: unknown 32134 1727204448.18744: variable 'ansible_search_path' from source: unknown 32134 1727204448.18753: calling self._execute() 32134 1727204448.18876: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.18895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.18913: variable 'omit' from source: magic vars 32134 1727204448.19502: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.19546: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.19712: variable 'connection_failed' from source: set_fact 32134 1727204448.19796: Evaluated conditional (not connection_failed): True 32134 1727204448.19895: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.19910: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.20045: variable 'connection_failed' from source: set_fact 32134 1727204448.20069: Evaluated conditional (not connection_failed): True 32134 1727204448.20082: _execute() done 32134 1727204448.20093: dumping result to json 32134 1727204448.20101: done dumping result, returning 32134 1727204448.20113: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-753f-5162-000000000067] 32134 1727204448.20123: sending task result for task 12b410aa-8751-753f-5162-000000000067 32134 1727204448.20263: done sending task result for task 12b410aa-8751-753f-5162-000000000067 32134 1727204448.20269: WORKER PROCESS EXITING 32134 1727204448.20322: no more pending results, returning what we have 32134 1727204448.20329: in VariableManager get_vars() 32134 1727204448.20378: Calling all_inventory to load vars for managed-node2 32134 
1727204448.20382: Calling groups_inventory to load vars for managed-node2 32134 1727204448.20385: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.20404: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.20409: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.20413: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.21911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.24176: done with get_vars() 32134 1727204448.24199: variable 'ansible_search_path' from source: unknown 32134 1727204448.24200: variable 'ansible_search_path' from source: unknown 32134 1727204448.24224: we have included files to process 32134 1727204448.24225: generating all_blocks data 32134 1727204448.24226: done generating all_blocks data 32134 1727204448.24227: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204448.24228: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204448.24229: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204448.24721: done processing included file 32134 1727204448.24723: iterating over new_blocks loaded from include file 32134 1727204448.24724: in VariableManager get_vars() 32134 1727204448.24740: done with get_vars() 32134 1727204448.24742: filtering new block on tags 32134 1727204448.24755: done filtering new block on tags 32134 1727204448.24757: in VariableManager get_vars() 32134 1727204448.24771: done with get_vars() 32134 1727204448.24772: filtering new block on tags 32134 1727204448.24788: done filtering new block on tags 32134 1727204448.24791: in VariableManager get_vars() 32134 1727204448.24808: done with get_vars() 32134 1727204448.24809: filtering new block on tags 32134 1727204448.24824: done filtering new block on tags 32134 1727204448.24826: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 32134 1727204448.24830: extending task lists for all hosts with included blocks 32134 1727204448.25119: done extending task lists 32134 1727204448.25120: done processing included files 32134 1727204448.25121: results queue empty 32134 1727204448.25122: checking for any_errors_fatal 32134 1727204448.25123: done checking for any_errors_fatal 32134 1727204448.25124: checking for max_fail_percentage 32134 1727204448.25125: done checking for max_fail_percentage 32134 1727204448.25125: checking to see if all hosts have failed and the running result is not ok 32134 1727204448.25126: done checking to see if all hosts have failed 32134 1727204448.25126: getting the remaining hosts for this loop 32134 1727204448.25127: done getting the remaining hosts for this loop 32134 1727204448.25129: getting the next task for host managed-node2 32134 1727204448.25132: done getting next task for host managed-node2 32134 1727204448.25134: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32134 1727204448.25136: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204448.25144: getting variables 32134 1727204448.25145: in VariableManager get_vars() 32134 1727204448.25155: Calling all_inventory to load vars for managed-node2 32134 1727204448.25157: Calling groups_inventory to load vars for managed-node2 32134 1727204448.25159: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.25163: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.25164: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.25167: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.26835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.28620: done with get_vars() 32134 1727204448.28641: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.109) 0:00:22.690 ***** 32134 1727204448.28701: entering _queue_task() for managed-node2/setup 32134 1727204448.29043: worker is 1 (out of 1 available) 32134 1727204448.29058: exiting _queue_task() for managed-node2/setup 32134 1727204448.29073: done queuing things up, now waiting for results queue to drain 32134 1727204448.29075: waiting for pending results... 
32134 1727204448.29524: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32134 1727204448.29531: in run() - task 12b410aa-8751-753f-5162-000000000495 32134 1727204448.29536: variable 'ansible_search_path' from source: unknown 32134 1727204448.29539: variable 'ansible_search_path' from source: unknown 32134 1727204448.29621: calling self._execute() 32134 1727204448.29696: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.29711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.29736: variable 'omit' from source: magic vars 32134 1727204448.30180: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.30201: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.30348: variable 'connection_failed' from source: set_fact 32134 1727204448.30361: Evaluated conditional (not connection_failed): True 32134 1727204448.30509: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.30695: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.30698: variable 'connection_failed' from source: set_fact 32134 1727204448.30701: Evaluated conditional (not connection_failed): True 32134 1727204448.30809: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.30823: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.30951: variable 'connection_failed' from source: set_fact 32134 1727204448.30964: Evaluated conditional (not connection_failed): True 32134 1727204448.31108: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.31124: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.31253: variable 'connection_failed' from source: set_fact 32134 1727204448.31258: Evaluated conditional (not connection_failed): True 32134 1727204448.31450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204448.33495: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204448.33499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204448.33502: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204448.33505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204448.33507: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204448.33582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204448.33629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204448.33666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204448.33733: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204448.33756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204448.33837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204448.33877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204448.33918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204448.33976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204448.34010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204448.34166: variable '__network_required_facts' from source: role '' defaults 32134 1727204448.34174: variable 'ansible_facts' from source: unknown 32134 1727204448.34894: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32134 1727204448.34897: when evaluation is False, skipping this task 32134 1727204448.34900: _execute() done 32134 1727204448.34903: dumping result to json 32134 1727204448.34908: done dumping result, returning 32134 1727204448.34919: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-753f-5162-000000000495] 32134 1727204448.34924: sending task result for task 12b410aa-8751-753f-5162-000000000495 32134 1727204448.35025: done sending task result for task 12b410aa-8751-753f-5162-000000000495 32134 1727204448.35028: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204448.35078: no more pending results, returning what we have 32134 1727204448.35083: results queue empty 32134 1727204448.35084: checking for any_errors_fatal 32134 1727204448.35086: done checking for any_errors_fatal 32134 1727204448.35087: checking for max_fail_percentage 32134 1727204448.35090: done checking for max_fail_percentage 32134 1727204448.35091: checking to see if all hosts have failed and the running result is not ok 32134 1727204448.35092: done checking to see if all hosts have failed 32134 1727204448.35093: getting the remaining hosts for this loop 32134 1727204448.35095: done getting the remaining hosts for this loop 32134 1727204448.35099: getting the next task for host managed-node2 32134 1727204448.35109: done getting next task for host managed-node2 32134 1727204448.35114: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32134 1727204448.35117: ^ state is: HOST 
STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204448.35132: getting variables 32134 1727204448.35134: in VariableManager get_vars() 32134 1727204448.35177: Calling all_inventory to load vars for managed-node2 32134 1727204448.35181: Calling groups_inventory to load vars for managed-node2 32134 1727204448.35197: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.35208: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.35211: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.35214: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.36487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.38091: done with get_vars() 32134 1727204448.38114: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.094) 0:00:22.785 ***** 32134 1727204448.38196: entering _queue_task() for managed-node2/stat 32134 1727204448.38444: worker is 1 (out of 1 available) 32134 1727204448.38460: exiting _queue_task() for managed-node2/stat 32134 1727204448.38473: done queuing things up, now waiting for results queue to drain 32134 1727204448.38475: waiting for pending results... 
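The ostree check queued here uses the stat module, and the log below shows it guarded by the conditional (not __network_is_ostree is defined). A rough sketch of the pair of tasks at set_facts.yml:12 and :17 follows; the stat path and the register name are assumptions rather than anything the log prints:

    # Hypothetical sketch of set_facts.yml:12 and :17 (path and register name assumed)
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumption: conventional ostree marker file
      register: __ostree_booted_stat      # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

In this run __network_is_ostree was already set earlier (the log reads "variable '__network_is_ostree' from source: set_fact"), so both tasks are skipped with false_condition "not __network_is_ostree is defined".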
32134 1727204448.38667: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 32134 1727204448.38778: in run() - task 12b410aa-8751-753f-5162-000000000497 32134 1727204448.38795: variable 'ansible_search_path' from source: unknown 32134 1727204448.38798: variable 'ansible_search_path' from source: unknown 32134 1727204448.38835: calling self._execute() 32134 1727204448.38913: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.38923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.38937: variable 'omit' from source: magic vars 32134 1727204448.39251: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.39264: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.39360: variable 'connection_failed' from source: set_fact 32134 1727204448.39365: Evaluated conditional (not connection_failed): True 32134 1727204448.39464: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.39467: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.39554: variable 'connection_failed' from source: set_fact 32134 1727204448.39557: Evaluated conditional (not connection_failed): True 32134 1727204448.39655: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.39659: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.39747: variable 'connection_failed' from source: set_fact 32134 1727204448.39751: Evaluated conditional (not connection_failed): True 32134 1727204448.39846: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.39850: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.39938: variable 'connection_failed' from source: set_fact 32134 1727204448.39942: Evaluated conditional (not connection_failed): True 32134 1727204448.40078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204448.40302: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204448.40342: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204448.40375: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204448.40406: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204448.40756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204448.40778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204448.40813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204448.40835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204448.40907: variable 
'__network_is_ostree' from source: set_fact 32134 1727204448.40916: Evaluated conditional (not __network_is_ostree is defined): False 32134 1727204448.40920: when evaluation is False, skipping this task 32134 1727204448.40922: _execute() done 32134 1727204448.40925: dumping result to json 32134 1727204448.40931: done dumping result, returning 32134 1727204448.40938: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-753f-5162-000000000497] 32134 1727204448.40943: sending task result for task 12b410aa-8751-753f-5162-000000000497 32134 1727204448.41038: done sending task result for task 12b410aa-8751-753f-5162-000000000497 32134 1727204448.41041: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32134 1727204448.41097: no more pending results, returning what we have 32134 1727204448.41102: results queue empty 32134 1727204448.41103: checking for any_errors_fatal 32134 1727204448.41115: done checking for any_errors_fatal 32134 1727204448.41116: checking for max_fail_percentage 32134 1727204448.41118: done checking for max_fail_percentage 32134 1727204448.41118: checking to see if all hosts have failed and the running result is not ok 32134 1727204448.41120: done checking to see if all hosts have failed 32134 1727204448.41120: getting the remaining hosts for this loop 32134 1727204448.41122: done getting the remaining hosts for this loop 32134 1727204448.41126: getting the next task for host managed-node2 32134 1727204448.41132: done getting next task for host managed-node2 32134 1727204448.41136: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32134 1727204448.41139: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204448.41155: getting variables 32134 1727204448.41156: in VariableManager get_vars() 32134 1727204448.41194: Calling all_inventory to load vars for managed-node2 32134 1727204448.41197: Calling groups_inventory to load vars for managed-node2 32134 1727204448.41200: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.41209: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.41214: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.41218: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.42537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.44139: done with get_vars() 32134 1727204448.44160: done getting variables 32134 1727204448.44217: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.060) 0:00:22.846 ***** 32134 1727204448.44244: entering _queue_task() for managed-node2/set_fact 32134 1727204448.44484: worker is 1 (out of 1 available) 32134 1727204448.44501: exiting _queue_task() for managed-node2/set_fact 32134 1727204448.44516: done queuing things up, now waiting for results queue to drain 32134 1727204448.44518: waiting for pending results... 32134 1727204448.44699: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32134 1727204448.44808: in run() - task 12b410aa-8751-753f-5162-000000000498 32134 1727204448.44820: variable 'ansible_search_path' from source: unknown 32134 1727204448.44823: variable 'ansible_search_path' from source: unknown 32134 1727204448.44858: calling self._execute() 32134 1727204448.44938: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.44945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.44957: variable 'omit' from source: magic vars 32134 1727204448.45266: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.45276: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.45372: variable 'connection_failed' from source: set_fact 32134 1727204448.45376: Evaluated conditional (not connection_failed): True 32134 1727204448.45471: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.45476: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.45563: variable 'connection_failed' from source: set_fact 32134 1727204448.45567: Evaluated conditional (not connection_failed): True 32134 1727204448.45664: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.45669: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.45751: variable 'connection_failed' from source: set_fact 32134 1727204448.45755: Evaluated conditional (not connection_failed): True 32134 1727204448.45848: variable 'ansible_distribution_major_version' from source: facts 32134 
1727204448.45853: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.45931: variable 'connection_failed' from source: set_fact 32134 1727204448.45936: Evaluated conditional (not connection_failed): True 32134 1727204448.46075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204448.46285: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204448.46327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204448.46356: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204448.46387: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204448.46487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204448.46516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204448.46537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204448.46559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204448.46631: variable '__network_is_ostree' from source: set_fact 32134 1727204448.46638: Evaluated conditional (not __network_is_ostree is defined): False 32134 1727204448.46641: when evaluation is False, skipping this task 32134 1727204448.46646: _execute() done 32134 1727204448.46649: dumping result to json 32134 1727204448.46654: done dumping result, returning 32134 1727204448.46662: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-753f-5162-000000000498] 32134 1727204448.46667: sending task result for task 12b410aa-8751-753f-5162-000000000498 32134 1727204448.46761: done sending task result for task 12b410aa-8751-753f-5162-000000000498 32134 1727204448.46764: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32134 1727204448.46824: no more pending results, returning what we have 32134 1727204448.46828: results queue empty 32134 1727204448.46829: checking for any_errors_fatal 32134 1727204448.46835: done checking for any_errors_fatal 32134 1727204448.46836: checking for max_fail_percentage 32134 1727204448.46838: done checking for max_fail_percentage 32134 1727204448.46839: checking to see if all hosts have failed and the running result is not ok 32134 1727204448.46840: done checking to see if all hosts have failed 32134 1727204448.46841: getting the remaining hosts for this loop 32134 1727204448.46843: done getting the remaining hosts for this loop 32134 1727204448.46847: getting the next task for host managed-node2 32134 1727204448.46856: done getting next task for host managed-node2 32134 1727204448.46860: ^ task is: TASK: fedora.linux_system_roles.network 
: Check which services are running 32134 1727204448.46863: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204448.46881: getting variables 32134 1727204448.46882: in VariableManager get_vars() 32134 1727204448.46921: Calling all_inventory to load vars for managed-node2 32134 1727204448.46924: Calling groups_inventory to load vars for managed-node2 32134 1727204448.46926: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204448.46936: Calling all_plugins_play to load vars for managed-node2 32134 1727204448.46939: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204448.46942: Calling groups_plugins_play to load vars for managed-node2 32134 1727204448.51539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204448.53174: done with get_vars() 32134 1727204448.53204: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:48 -0400 (0:00:00.090) 0:00:22.936 ***** 32134 1727204448.53270: entering _queue_task() for managed-node2/service_facts 32134 1727204448.53549: worker is 1 (out of 1 available) 32134 1727204448.53564: exiting _queue_task() for managed-node2/service_facts 32134 1727204448.53575: done queuing things up, now waiting for results queue to drain 32134 1727204448.53577: waiting for pending results... 
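This task is the first in the included file that actually runs on the remote host: the log below shows the ssh connection plugin being loaded, a temporary directory created under /root/.ansible/tmp, the AnsiballZ_service_facts.py payload transferred and executed with python3.12, and a large ansible_facts.services dictionary returned on stdout. As a rough sketch (the module is confirmed by the log; service_facts takes no arguments), the task at set_facts.yml:21 amounts to:

    # Sketch of set_facts.yml:21; service_facts gathers the services dict seen in the stdout below
    - name: Check which services are running
      service_facts:

The gathered facts could then be inspected with, for example, a debug task (hypothetical, not part of the role):

    - debug:
        var: ansible_facts.services['NetworkManager.service'].state

which would report "running" for this host according to the service_facts output captured further down.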
32134 1727204448.53782: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 32134 1727204448.53893: in run() - task 12b410aa-8751-753f-5162-00000000049a 32134 1727204448.53906: variable 'ansible_search_path' from source: unknown 32134 1727204448.53912: variable 'ansible_search_path' from source: unknown 32134 1727204448.53947: calling self._execute() 32134 1727204448.54032: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.54037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.54050: variable 'omit' from source: magic vars 32134 1727204448.54380: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.54393: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.54492: variable 'connection_failed' from source: set_fact 32134 1727204448.54498: Evaluated conditional (not connection_failed): True 32134 1727204448.54617: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.54622: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.54796: variable 'connection_failed' from source: set_fact 32134 1727204448.54800: Evaluated conditional (not connection_failed): True 32134 1727204448.54903: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.54925: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.55059: variable 'connection_failed' from source: set_fact 32134 1727204448.55071: Evaluated conditional (not connection_failed): True 32134 1727204448.55221: variable 'ansible_distribution_major_version' from source: facts 32134 1727204448.55250: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204448.55376: variable 'connection_failed' from source: set_fact 32134 1727204448.55596: Evaluated conditional (not connection_failed): True 32134 1727204448.55599: variable 'omit' from source: magic vars 32134 1727204448.55602: variable 'omit' from source: magic vars 32134 1727204448.55605: variable 'omit' from source: magic vars 32134 1727204448.55607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204448.55622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204448.55653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204448.55680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204448.55700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204448.55753: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204448.55763: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.55772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.55923: Set connection var ansible_timeout to 10 32134 1727204448.55961: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204448.55970: Set connection var ansible_connection to ssh 32134 1727204448.55977: Set connection var ansible_shell_type to sh 32134 1727204448.55993: Set connection var ansible_shell_executable to /bin/sh 32134 
1727204448.56052: Set connection var ansible_pipelining to False 32134 1727204448.56056: variable 'ansible_shell_executable' from source: unknown 32134 1727204448.56060: variable 'ansible_connection' from source: unknown 32134 1727204448.56063: variable 'ansible_module_compression' from source: unknown 32134 1727204448.56074: variable 'ansible_shell_type' from source: unknown 32134 1727204448.56085: variable 'ansible_shell_executable' from source: unknown 32134 1727204448.56096: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204448.56162: variable 'ansible_pipelining' from source: unknown 32134 1727204448.56166: variable 'ansible_timeout' from source: unknown 32134 1727204448.56168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204448.56434: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204448.56461: variable 'omit' from source: magic vars 32134 1727204448.56474: starting attempt loop 32134 1727204448.56491: running the handler 32134 1727204448.56523: _low_level_execute_command(): starting 32134 1727204448.56539: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204448.57480: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.57484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204448.57737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204448.57754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204448.59563: stdout chunk (state=3): >>>/root <<< 32134 1727204448.59887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204448.59894: stdout chunk (state=3): >>><<< 32134 1727204448.59897: stderr chunk (state=3): >>><<< 32134 1727204448.59900: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204448.59903: _low_level_execute_command(): starting 32134 1727204448.59917: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903 `" && echo ansible-tmp-1727204448.5984247-33478-242519462969903="` echo /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903 `" ) && sleep 0' 32134 1727204448.60860: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204448.60864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.60873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204448.60878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204448.60880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.60927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204448.60950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204448.61023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204448.63054: stdout chunk (state=3): >>>ansible-tmp-1727204448.5984247-33478-242519462969903=/root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903 <<< 32134 1727204448.63300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204448.63303: stdout chunk (state=3): >>><<< 32134 1727204448.63306: stderr chunk (state=3): >>><<< 32134 1727204448.63308: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204448.5984247-33478-242519462969903=/root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204448.63332: variable 'ansible_module_compression' from source: unknown 32134 1727204448.63382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 32134 1727204448.63441: variable 'ansible_facts' from source: unknown 32134 1727204448.63526: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py 32134 1727204448.63768: Sending initial data 32134 1727204448.63784: Sent initial data (162 bytes) 32134 1727204448.64344: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.64407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.64468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204448.64488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204448.64545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204448.64583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204448.66244: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204448.66296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204448.66337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpxqgsty36 /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py <<< 32134 1727204448.66341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py" <<< 32134 1727204448.66371: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpxqgsty36" to remote "/root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py" <<< 32134 1727204448.67485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204448.67548: stderr chunk (state=3): >>><<< 32134 1727204448.67565: stdout chunk (state=3): >>><<< 32134 1727204448.67601: done transferring module to remote 32134 1727204448.67696: _low_level_execute_command(): starting 32134 1727204448.67700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/ /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py && sleep 0' 32134 1727204448.68344: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204448.68409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204448.68493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204448.68521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204448.68543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204448.68562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204448.68647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204448.70615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204448.70655: stderr chunk (state=3): >>><<< 32134 1727204448.70669: stdout chunk (state=3): >>><<< 32134 1727204448.70693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204448.70707: _low_level_execute_command(): starting 32134 1727204448.70803: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/AnsiballZ_service_facts.py && sleep 0' 32134 1727204448.71510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204448.71539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204448.71619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204450.75887: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": 
"display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 32134 1727204450.75927: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service"<<< 32134 1727204450.75937: stdout chunk (state=3): >>>, "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 32134 1727204450.75965: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd<<< 32134 1727204450.75991: stdout chunk (state=3): >>>"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32134 1727204450.77676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204450.77737: stderr chunk (state=3): >>><<< 32134 1727204450.77741: stdout chunk (state=3): >>><<< 32134 1727204450.77778: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": 
{"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": 
{"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204450.78449: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204450.78458: _low_level_execute_command(): starting 32134 1727204450.78464: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204448.5984247-33478-242519462969903/ > /dev/null 2>&1 && sleep 0' 32134 1727204450.78949: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204450.78953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.78955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204450.78958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.79017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204450.79024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 
1727204450.79066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204450.81056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204450.81120: stderr chunk (state=3): >>><<< 32134 1727204450.81124: stdout chunk (state=3): >>><<< 32134 1727204450.81133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204450.81143: handler run complete 32134 1727204450.81313: variable 'ansible_facts' from source: unknown 32134 1727204450.81462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204450.81906: variable 'ansible_facts' from source: unknown 32134 1727204450.82032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204450.82230: attempt loop complete, returning result 32134 1727204450.82236: _execute() done 32134 1727204450.82239: dumping result to json 32134 1727204450.82284: done dumping result, returning 32134 1727204450.82295: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-753f-5162-00000000049a] 32134 1727204450.82301: sending task result for task 12b410aa-8751-753f-5162-00000000049a ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204450.83131: done sending task result for task 12b410aa-8751-753f-5162-00000000049a 32134 1727204450.83134: WORKER PROCESS EXITING 32134 1727204450.83141: no more pending results, returning what we have 32134 1727204450.83144: results queue empty 32134 1727204450.83145: checking for any_errors_fatal 32134 1727204450.83150: done checking for any_errors_fatal 32134 1727204450.83150: checking for max_fail_percentage 32134 1727204450.83152: done checking for max_fail_percentage 32134 1727204450.83152: checking to see if all hosts have failed and the running result is not ok 32134 1727204450.83153: done checking to see if all hosts have failed 32134 1727204450.83153: getting the remaining hosts for this loop 32134 1727204450.83154: done getting the remaining hosts for this loop 32134 1727204450.83157: getting the next task for host managed-node2 32134 1727204450.83161: done getting next task for host managed-node2 32134 
1727204450.83164: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 32134 1727204450.83166: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204450.83174: getting variables 32134 1727204450.83175: in VariableManager get_vars() 32134 1727204450.83202: Calling all_inventory to load vars for managed-node2 32134 1727204450.83205: Calling groups_inventory to load vars for managed-node2 32134 1727204450.83207: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204450.83215: Calling all_plugins_play to load vars for managed-node2 32134 1727204450.83217: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204450.83219: Calling groups_plugins_play to load vars for managed-node2 32134 1727204450.84484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204450.86098: done with get_vars() 32134 1727204450.86122: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:00:50 -0400 (0:00:02.329) 0:00:25.265 ***** 32134 1727204450.86203: entering _queue_task() for managed-node2/package_facts 32134 1727204450.86459: worker is 1 (out of 1 available) 32134 1727204450.86474: exiting _queue_task() for managed-node2/package_facts 32134 1727204450.86488: done queuing things up, now waiting for results queue to drain 32134 1727204450.86491: waiting for pending results... 
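For reference, the service inventory returned by the "Check which services are running" task above is a flat mapping under ansible_facts.services, where each entry records name, state, status and source (e.g. "sshd.service": {"state": "running", "status": "enabled", "source": "systemd"}). The snippet below is a minimal sketch of how that payload could be inspected outside of Ansible; the file name services_facts.json is hypothetical and simply stands in for the module's JSON output captured in the stdout above.

```python
# Minimal sketch: list the services reported as "running" in a saved
# service_facts payload. "services_facts.json" is a hypothetical file
# containing the JSON printed in the stdout chunks above.
import json

with open("services_facts.json") as fh:
    payload = json.load(fh)

# Payload shape, as seen in the log:
# {"ansible_facts": {"services": {"sshd.service": {"name": ..., "state": ...,
#                                                  "status": ..., "source": ...}, ...}},
#  "invocation": {"module_args": {}}}
services = payload["ansible_facts"]["services"]

running = sorted(name for name, svc in services.items()
                 if svc.get("state") == "running")

for name in running:
    print(name)
```

Run against the output captured above, this would print entries such as NetworkManager.service, sshd.service and systemd-journald.service, which the dump lists with state "running".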
32134 1727204450.86688: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 32134 1727204450.86790: in run() - task 12b410aa-8751-753f-5162-00000000049b 32134 1727204450.86805: variable 'ansible_search_path' from source: unknown 32134 1727204450.86809: variable 'ansible_search_path' from source: unknown 32134 1727204450.86844: calling self._execute() 32134 1727204450.86931: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204450.86942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204450.86996: variable 'omit' from source: magic vars 32134 1727204450.87283: variable 'ansible_distribution_major_version' from source: facts 32134 1727204450.87295: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204450.87388: variable 'connection_failed' from source: set_fact 32134 1727204450.87399: Evaluated conditional (not connection_failed): True 32134 1727204450.87492: variable 'ansible_distribution_major_version' from source: facts 32134 1727204450.87501: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204450.87583: variable 'connection_failed' from source: set_fact 32134 1727204450.87586: Evaluated conditional (not connection_failed): True 32134 1727204450.87681: variable 'ansible_distribution_major_version' from source: facts 32134 1727204450.87684: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204450.87774: variable 'connection_failed' from source: set_fact 32134 1727204450.87777: Evaluated conditional (not connection_failed): True 32134 1727204450.87871: variable 'ansible_distribution_major_version' from source: facts 32134 1727204450.87876: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204450.87965: variable 'connection_failed' from source: set_fact 32134 1727204450.87968: Evaluated conditional (not connection_failed): True 32134 1727204450.87976: variable 'omit' from source: magic vars 32134 1727204450.88024: variable 'omit' from source: magic vars 32134 1727204450.88058: variable 'omit' from source: magic vars 32134 1727204450.88093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204450.88125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204450.88145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204450.88164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204450.88177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204450.88206: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204450.88210: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204450.88217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204450.88304: Set connection var ansible_timeout to 10 32134 1727204450.88319: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204450.88323: Set connection var ansible_connection to ssh 32134 1727204450.88325: Set connection var ansible_shell_type to sh 32134 1727204450.88332: Set connection var ansible_shell_executable to /bin/sh 
32134 1727204450.88338: Set connection var ansible_pipelining to False 32134 1727204450.88359: variable 'ansible_shell_executable' from source: unknown 32134 1727204450.88367: variable 'ansible_connection' from source: unknown 32134 1727204450.88374: variable 'ansible_module_compression' from source: unknown 32134 1727204450.88377: variable 'ansible_shell_type' from source: unknown 32134 1727204450.88379: variable 'ansible_shell_executable' from source: unknown 32134 1727204450.88382: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204450.88388: variable 'ansible_pipelining' from source: unknown 32134 1727204450.88392: variable 'ansible_timeout' from source: unknown 32134 1727204450.88398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204450.88566: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204450.88576: variable 'omit' from source: magic vars 32134 1727204450.88587: starting attempt loop 32134 1727204450.88592: running the handler 32134 1727204450.88609: _low_level_execute_command(): starting 32134 1727204450.88618: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204450.89164: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204450.89168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.89226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204450.89234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204450.89282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204450.91059: stdout chunk (state=3): >>>/root <<< 32134 1727204450.91175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204450.91230: stderr chunk (state=3): >>><<< 32134 1727204450.91234: stdout chunk (state=3): >>><<< 32134 1727204450.91253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204450.91268: _low_level_execute_command(): starting 32134 1727204450.91275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252 `" && echo ansible-tmp-1727204450.9125323-33521-67118594533252="` echo /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252 `" ) && sleep 0' 32134 1727204450.91748: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204450.91752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.91755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204450.91757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.91812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204450.91816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204450.91862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204450.93936: stdout chunk (state=3): >>>ansible-tmp-1727204450.9125323-33521-67118594533252=/root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252 <<< 32134 1727204450.94056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204450.94106: stderr chunk (state=3): >>><<< 32134 1727204450.94110: stdout chunk (state=3): >>><<< 32134 1727204450.94127: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204450.9125323-33521-67118594533252=/root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204450.94170: variable 'ansible_module_compression' from source: unknown 32134 1727204450.94211: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 32134 1727204450.94269: variable 'ansible_facts' from source: unknown 32134 1727204450.94408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py 32134 1727204450.94535: Sending initial data 32134 1727204450.94539: Sent initial data (161 bytes) 32134 1727204450.95012: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204450.95016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204450.95019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204450.95023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204450.95026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.95075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204450.95079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204450.95123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204450.96816: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32134 1727204450.96824: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204450.96850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204450.96891: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpwouvrhxp /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py <<< 32134 1727204450.96895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py" <<< 32134 1727204450.96927: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpwouvrhxp" to remote "/root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py" <<< 32134 1727204450.98622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204450.98682: stderr chunk (state=3): >>><<< 32134 1727204450.98685: stdout chunk (state=3): >>><<< 32134 1727204450.98709: done transferring module to remote 32134 1727204450.98720: _low_level_execute_command(): starting 32134 1727204450.98730: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/ /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py && sleep 0' 32134 1727204450.99194: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204450.99198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.99201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204450.99203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204450.99259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204450.99264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204450.99304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204451.01267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204451.01315: stderr chunk (state=3): >>><<< 32134 1727204451.01321: stdout chunk (state=3): >>><<< 32134 1727204451.01336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204451.01339: _low_level_execute_command(): starting 32134 1727204451.01345: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/AnsiballZ_package_facts.py && sleep 0' 32134 1727204451.01789: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204451.01793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204451.01795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204451.01799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204451.01802: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204451.01857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204451.01860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204451.01909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204451.66374: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": 
"hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 32134 1727204451.66411: stdout chunk (state=3): >>>rch": "noarch", 
"source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": 
[{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 32134 1727204451.66431: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": 
[{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": 
"633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 32134 1727204451.66460: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 32134 1727204451.66485: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": 
"9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": 
"elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 32134 1727204451.66520: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", 
"version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": 
[{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 32134 1727204451.66544: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", 
"release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": 
"9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 32134 1727204451.66551: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": 
"1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 32134 1727204451.66591: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 32134 1727204451.66629: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": 
"20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 32134 1727204451.66635: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": 
"python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 32134 1727204451.66638: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": 
"3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32134 1727204451.68646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204451.68649: stdout chunk (state=3): >>><<< 32134 1727204451.68652: stderr chunk (state=3): >>><<< 32134 1727204451.68905: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204451.71638: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204451.71660: _low_level_execute_command(): starting 32134 1727204451.71664: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204450.9125323-33521-67118594533252/ > /dev/null 2>&1 && sleep 0' 32134 1727204451.72136: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204451.72140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204451.72142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204451.72144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204451.72147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204451.72196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204451.72201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204451.72255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204451.74267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204451.74322: stderr chunk (state=3): >>><<< 32134 1727204451.74325: stdout chunk (state=3): >>><<< 32134 1727204451.74338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204451.74347: handler run complete 32134 1727204451.75159: variable 'ansible_facts' from source: unknown 32134 1727204451.75613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204451.77639: variable 'ansible_facts' from source: unknown 32134 1727204451.78063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204451.78832: attempt loop complete, returning result 32134 1727204451.78850: _execute() done 32134 1727204451.78853: dumping result to json 32134 1727204451.79030: done dumping result, returning 32134 1727204451.79040: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-753f-5162-00000000049b] 32134 1727204451.79045: sending task result for task 12b410aa-8751-753f-5162-00000000049b 32134 1727204451.81082: done sending task result for task 12b410aa-8751-753f-5162-00000000049b 32134 1727204451.81085: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204451.81187: no more pending results, returning what we have 32134 1727204451.81191: results queue empty 32134 1727204451.81192: checking for any_errors_fatal 32134 1727204451.81196: done checking for any_errors_fatal 32134 1727204451.81196: checking for max_fail_percentage 32134 1727204451.81197: done checking for max_fail_percentage 32134 1727204451.81198: checking to see if all hosts have failed and the running result is not ok 32134 1727204451.81199: done checking to see if all hosts have failed 32134 1727204451.81199: getting the remaining hosts for this loop 32134 1727204451.81200: done getting the remaining hosts for this loop 32134 1727204451.81203: getting the next task for host managed-node2 32134 1727204451.81208: done getting next task for host managed-node2 32134 1727204451.81211: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32134 1727204451.81213: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204451.81220: getting variables 32134 1727204451.81221: in VariableManager get_vars() 32134 1727204451.81247: Calling all_inventory to load vars for managed-node2 32134 1727204451.81250: Calling groups_inventory to load vars for managed-node2 32134 1727204451.81253: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204451.81261: Calling all_plugins_play to load vars for managed-node2 32134 1727204451.81263: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204451.81266: Calling groups_plugins_play to load vars for managed-node2 32134 1727204451.82449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204451.84061: done with get_vars() 32134 1727204451.84090: done getting variables 32134 1727204451.84143: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.979) 0:00:26.245 ***** 32134 1727204451.84168: entering _queue_task() for managed-node2/debug 32134 1727204451.84438: worker is 1 (out of 1 available) 32134 1727204451.84452: exiting _queue_task() for managed-node2/debug 32134 1727204451.84464: done queuing things up, now waiting for results queue to drain 32134 1727204451.84466: waiting for pending results... 32134 1727204451.84670: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 32134 1727204451.84758: in run() - task 12b410aa-8751-753f-5162-000000000068 32134 1727204451.84772: variable 'ansible_search_path' from source: unknown 32134 1727204451.84776: variable 'ansible_search_path' from source: unknown 32134 1727204451.84813: calling self._execute() 32134 1727204451.84898: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204451.84905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204451.84918: variable 'omit' from source: magic vars 32134 1727204451.85252: variable 'ansible_distribution_major_version' from source: facts 32134 1727204451.85264: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204451.85360: variable 'connection_failed' from source: set_fact 32134 1727204451.85366: Evaluated conditional (not connection_failed): True 32134 1727204451.85461: variable 'ansible_distribution_major_version' from source: facts 32134 1727204451.85465: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204451.85554: variable 'connection_failed' from source: set_fact 32134 1727204451.85558: Evaluated conditional (not connection_failed): True 32134 1727204451.85568: variable 'omit' from source: magic vars 32134 1727204451.85603: variable 'omit' from source: magic vars 32134 1727204451.85682: variable 'network_provider' from source: set_fact 32134 1727204451.85702: variable 'omit' from source: magic vars 32134 1727204451.85741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204451.85772: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204451.85790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204451.85810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204451.85825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204451.85853: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204451.85857: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204451.85861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204451.85952: Set connection var ansible_timeout to 10 32134 1727204451.85965: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204451.85968: Set connection var ansible_connection to ssh 32134 1727204451.85971: Set connection var ansible_shell_type to sh 32134 1727204451.85978: Set connection var ansible_shell_executable to /bin/sh 32134 1727204451.85984: Set connection var ansible_pipelining to False 32134 1727204451.86005: variable 'ansible_shell_executable' from source: unknown 32134 1727204451.86009: variable 'ansible_connection' from source: unknown 32134 1727204451.86012: variable 'ansible_module_compression' from source: unknown 32134 1727204451.86020: variable 'ansible_shell_type' from source: unknown 32134 1727204451.86023: variable 'ansible_shell_executable' from source: unknown 32134 1727204451.86026: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204451.86036: variable 'ansible_pipelining' from source: unknown 32134 1727204451.86039: variable 'ansible_timeout' from source: unknown 32134 1727204451.86041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204451.86164: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204451.86175: variable 'omit' from source: magic vars 32134 1727204451.86181: starting attempt loop 32134 1727204451.86184: running the handler 32134 1727204451.86230: handler run complete 32134 1727204451.86243: attempt loop complete, returning result 32134 1727204451.86248: _execute() done 32134 1727204451.86251: dumping result to json 32134 1727204451.86254: done dumping result, returning 32134 1727204451.86332: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-753f-5162-000000000068] 32134 1727204451.86336: sending task result for task 12b410aa-8751-753f-5162-000000000068 32134 1727204451.86404: done sending task result for task 12b410aa-8751-753f-5162-000000000068 32134 1727204451.86408: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 32134 1727204451.86494: no more pending results, returning what we have 32134 1727204451.86497: results queue empty 32134 1727204451.86498: checking for any_errors_fatal 32134 1727204451.86506: done checking for any_errors_fatal 32134 1727204451.86507: checking for max_fail_percentage 32134 1727204451.86509: done checking for max_fail_percentage 32134 
1727204451.86510: checking to see if all hosts have failed and the running result is not ok 32134 1727204451.86511: done checking to see if all hosts have failed 32134 1727204451.86511: getting the remaining hosts for this loop 32134 1727204451.86513: done getting the remaining hosts for this loop 32134 1727204451.86518: getting the next task for host managed-node2 32134 1727204451.86523: done getting next task for host managed-node2 32134 1727204451.86527: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32134 1727204451.86529: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204451.86539: getting variables 32134 1727204451.86540: in VariableManager get_vars() 32134 1727204451.86576: Calling all_inventory to load vars for managed-node2 32134 1727204451.86579: Calling groups_inventory to load vars for managed-node2 32134 1727204451.86582: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204451.86597: Calling all_plugins_play to load vars for managed-node2 32134 1727204451.86599: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204451.86602: Calling groups_plugins_play to load vars for managed-node2 32134 1727204451.87914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204451.89532: done with get_vars() 32134 1727204451.89555: done getting variables 32134 1727204451.89606: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.054) 0:00:26.300 ***** 32134 1727204451.89635: entering _queue_task() for managed-node2/fail 32134 1727204451.89888: worker is 1 (out of 1 available) 32134 1727204451.89904: exiting _queue_task() for managed-node2/fail 32134 1727204451.89920: done queuing things up, now waiting for results queue to drain 32134 1727204451.89923: waiting for pending results... 
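The preceding "Check which packages are installed" step ran package_facts with no_log enabled ('_ansible_no_log': True in the module args), which is why its result is shown as the "censored" placeholder rather than the package list. The "Print network provider" task that follows (tasks/main.yml:7) is a plain debug action: the worker loads the debug action plugin, runs the handler without contacting the host, and emits "Using network provider: nm". A minimal sketch of what that task likely looks like in the role, reconstructed from the log output (the actual YAML in fedora.linux_system_roles.network may differ):

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"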
32134 1727204451.90125: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32134 1727204451.90210: in run() - task 12b410aa-8751-753f-5162-000000000069 32134 1727204451.90227: variable 'ansible_search_path' from source: unknown 32134 1727204451.90231: variable 'ansible_search_path' from source: unknown 32134 1727204451.90264: calling self._execute() 32134 1727204451.90350: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204451.90358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204451.90369: variable 'omit' from source: magic vars 32134 1727204451.90696: variable 'ansible_distribution_major_version' from source: facts 32134 1727204451.90707: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204451.90805: variable 'connection_failed' from source: set_fact 32134 1727204451.90809: Evaluated conditional (not connection_failed): True 32134 1727204451.90902: variable 'ansible_distribution_major_version' from source: facts 32134 1727204451.90907: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204451.91030: variable 'connection_failed' from source: set_fact 32134 1727204451.91034: Evaluated conditional (not connection_failed): True 32134 1727204451.91098: variable 'network_state' from source: role '' defaults 32134 1727204451.91107: Evaluated conditional (network_state != {}): False 32134 1727204451.91110: when evaluation is False, skipping this task 32134 1727204451.91116: _execute() done 32134 1727204451.91119: dumping result to json 32134 1727204451.91122: done dumping result, returning 32134 1727204451.91132: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-753f-5162-000000000069] 32134 1727204451.91135: sending task result for task 12b410aa-8751-753f-5162-000000000069 32134 1727204451.91237: done sending task result for task 12b410aa-8751-753f-5162-000000000069 32134 1727204451.91240: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204451.91298: no more pending results, returning what we have 32134 1727204451.91302: results queue empty 32134 1727204451.91303: checking for any_errors_fatal 32134 1727204451.91309: done checking for any_errors_fatal 32134 1727204451.91310: checking for max_fail_percentage 32134 1727204451.91314: done checking for max_fail_percentage 32134 1727204451.91315: checking to see if all hosts have failed and the running result is not ok 32134 1727204451.91316: done checking to see if all hosts have failed 32134 1727204451.91317: getting the remaining hosts for this loop 32134 1727204451.91318: done getting the remaining hosts for this loop 32134 1727204451.91322: getting the next task for host managed-node2 32134 1727204451.91328: done getting next task for host managed-node2 32134 1727204451.91332: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32134 1727204451.91335: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204451.91349: getting variables 32134 1727204451.91353: in VariableManager get_vars() 32134 1727204451.91387: Calling all_inventory to load vars for managed-node2 32134 1727204451.91391: Calling groups_inventory to load vars for managed-node2 32134 1727204451.91394: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204451.91405: Calling all_plugins_play to load vars for managed-node2 32134 1727204451.91408: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204451.91414: Calling groups_plugins_play to load vars for managed-node2 32134 1727204451.92613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204451.94229: done with get_vars() 32134 1727204451.94251: done getting variables 32134 1727204451.94304: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.046) 0:00:26.347 ***** 32134 1727204451.94330: entering _queue_task() for managed-node2/fail 32134 1727204451.94569: worker is 1 (out of 1 available) 32134 1727204451.94583: exiting _queue_task() for managed-node2/fail 32134 1727204451.94597: done queuing things up, now waiting for results queue to drain 32134 1727204451.94599: waiting for pending results... 
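The abort task at tasks/main.yml:11 is skipped because its conditional, network_state != {}, evaluates to False; network_state comes from the role defaults and is empty in this run. A hedged sketch of the pattern, assuming the role pairs the network_state check with a provider check before failing; only the network_state condition is visible in this log, and the failure message is illustrative:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider  # illustrative message
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumed second condition; not visible in this log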
32134 1727204451.94797: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32134 1727204451.94878: in run() - task 12b410aa-8751-753f-5162-00000000006a 32134 1727204451.94891: variable 'ansible_search_path' from source: unknown 32134 1727204451.94896: variable 'ansible_search_path' from source: unknown 32134 1727204451.94927: calling self._execute() 32134 1727204451.95015: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204451.95019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204451.95029: variable 'omit' from source: magic vars 32134 1727204451.95342: variable 'ansible_distribution_major_version' from source: facts 32134 1727204451.95353: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204451.95449: variable 'connection_failed' from source: set_fact 32134 1727204451.95453: Evaluated conditional (not connection_failed): True 32134 1727204451.95552: variable 'ansible_distribution_major_version' from source: facts 32134 1727204451.95556: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204451.95644: variable 'connection_failed' from source: set_fact 32134 1727204451.95648: Evaluated conditional (not connection_failed): True 32134 1727204451.95751: variable 'network_state' from source: role '' defaults 32134 1727204451.95761: Evaluated conditional (network_state != {}): False 32134 1727204451.95764: when evaluation is False, skipping this task 32134 1727204451.95767: _execute() done 32134 1727204451.95772: dumping result to json 32134 1727204451.95775: done dumping result, returning 32134 1727204451.95784: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-753f-5162-00000000006a] 32134 1727204451.95791: sending task result for task 12b410aa-8751-753f-5162-00000000006a 32134 1727204451.95886: done sending task result for task 12b410aa-8751-753f-5162-00000000006a 32134 1727204451.95888: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204451.95960: no more pending results, returning what we have 32134 1727204451.95964: results queue empty 32134 1727204451.95965: checking for any_errors_fatal 32134 1727204451.95971: done checking for any_errors_fatal 32134 1727204451.95971: checking for max_fail_percentage 32134 1727204451.95973: done checking for max_fail_percentage 32134 1727204451.95974: checking to see if all hosts have failed and the running result is not ok 32134 1727204451.95975: done checking to see if all hosts have failed 32134 1727204451.95976: getting the remaining hosts for this loop 32134 1727204451.95977: done getting the remaining hosts for this loop 32134 1727204451.95981: getting the next task for host managed-node2 32134 1727204451.95986: done getting next task for host managed-node2 32134 1727204451.95992: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32134 1727204451.95995: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204451.96010: getting variables 32134 1727204451.96013: in VariableManager get_vars() 32134 1727204451.96047: Calling all_inventory to load vars for managed-node2 32134 1727204451.96050: Calling groups_inventory to load vars for managed-node2 32134 1727204451.96053: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204451.96062: Calling all_plugins_play to load vars for managed-node2 32134 1727204451.96064: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204451.96067: Calling groups_plugins_play to load vars for managed-node2 32134 1727204451.97416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204451.99253: done with get_vars() 32134 1727204451.99278: done getting variables 32134 1727204451.99332: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:00:51 -0400 (0:00:00.050) 0:00:26.397 ***** 32134 1727204451.99357: entering _queue_task() for managed-node2/fail 32134 1727204451.99617: worker is 1 (out of 1 available) 32134 1727204451.99632: exiting _queue_task() for managed-node2/fail 32134 1727204451.99646: done queuing things up, now waiting for results queue to drain 32134 1727204451.99648: waiting for pending results... 
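The version guard at tasks/main.yml:18 reports the same false_condition, network_state != {}. With a list-form when, the conditions are ANDed and evaluated in order, and the first one that comes back False is the one recorded in the skip result, so the version comparison is never reached here. A sketch of that shape, with the message and the second condition reconstructed rather than taken from the role source:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires EL8 or later  # illustrative message
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8  # assumed; never evaluated here because the first condition already failed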
32134 1727204451.99852: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32134 1727204451.99934: in run() - task 12b410aa-8751-753f-5162-00000000006b 32134 1727204451.99949: variable 'ansible_search_path' from source: unknown 32134 1727204451.99953: variable 'ansible_search_path' from source: unknown 32134 1727204451.99985: calling self._execute() 32134 1727204452.00076: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.00081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.00098: variable 'omit' from source: magic vars 32134 1727204452.00420: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.00440: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.00570: variable 'connection_failed' from source: set_fact 32134 1727204452.00794: Evaluated conditional (not connection_failed): True 32134 1727204452.00798: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.00801: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.00860: variable 'connection_failed' from source: set_fact 32134 1727204452.00871: Evaluated conditional (not connection_failed): True 32134 1727204452.01088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204452.03667: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204452.03754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204452.03809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204452.03856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204452.03903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204452.04003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.04060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.04100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.04156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.04177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.04295: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.04320: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32134 1727204452.04475: variable 
'ansible_distribution' from source: facts 32134 1727204452.04486: variable '__network_rh_distros' from source: role '' defaults 32134 1727204452.04502: Evaluated conditional (ansible_distribution in __network_rh_distros): False 32134 1727204452.04511: when evaluation is False, skipping this task 32134 1727204452.04520: _execute() done 32134 1727204452.04527: dumping result to json 32134 1727204452.04536: done dumping result, returning 32134 1727204452.04548: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-753f-5162-00000000006b] 32134 1727204452.04559: sending task result for task 12b410aa-8751-753f-5162-00000000006b skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 32134 1727204452.04732: no more pending results, returning what we have 32134 1727204452.04736: results queue empty 32134 1727204452.04737: checking for any_errors_fatal 32134 1727204452.04745: done checking for any_errors_fatal 32134 1727204452.04746: checking for max_fail_percentage 32134 1727204452.04748: done checking for max_fail_percentage 32134 1727204452.04749: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.04750: done checking to see if all hosts have failed 32134 1727204452.04751: getting the remaining hosts for this loop 32134 1727204452.04752: done getting the remaining hosts for this loop 32134 1727204452.04757: getting the next task for host managed-node2 32134 1727204452.04764: done getting next task for host managed-node2 32134 1727204452.04995: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32134 1727204452.04998: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204452.05013: done sending task result for task 12b410aa-8751-753f-5162-00000000006b 32134 1727204452.05017: WORKER PROCESS EXITING 32134 1727204452.05026: getting variables 32134 1727204452.05027: in VariableManager get_vars() 32134 1727204452.05067: Calling all_inventory to load vars for managed-node2 32134 1727204452.05071: Calling groups_inventory to load vars for managed-node2 32134 1727204452.05074: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.05084: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.05088: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.05094: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.07041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.09114: done with get_vars() 32134 1727204452.09151: done getting variables 32134 1727204452.09221: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.098) 0:00:26.496 ***** 32134 1727204452.09256: entering _queue_task() for managed-node2/dnf 32134 1727204452.09605: worker is 1 (out of 1 available) 32134 1727204452.09619: exiting _queue_task() for managed-node2/dnf 32134 1727204452.09631: done queuing things up, now waiting for results queue to drain 32134 1727204452.09633: waiting for pending results... 
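The teaming guard at tasks/main.yml:25 gets one step further: ansible_distribution_major_version | int > 9 evaluates True, but ansible_distribution in __network_rh_distros is False on this host, so the task is skipped with that as the false_condition. A hedged sketch of the guard as it appears from the log; the real role presumably also requires that a team connection is actually defined, which is not visible here:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 and later  # illustrative message
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    # the role presumably also checks that a team connection exists in network_connections;
    # that part is omitted because it is not visible in this log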
32134 1727204452.09948: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32134 1727204452.10079: in run() - task 12b410aa-8751-753f-5162-00000000006c 32134 1727204452.10104: variable 'ansible_search_path' from source: unknown 32134 1727204452.10118: variable 'ansible_search_path' from source: unknown 32134 1727204452.10163: calling self._execute() 32134 1727204452.10281: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.10298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.10314: variable 'omit' from source: magic vars 32134 1727204452.10771: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.10791: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.10935: variable 'connection_failed' from source: set_fact 32134 1727204452.10947: Evaluated conditional (not connection_failed): True 32134 1727204452.11098: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.11112: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.11247: variable 'connection_failed' from source: set_fact 32134 1727204452.11258: Evaluated conditional (not connection_failed): True 32134 1727204452.11517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204452.14228: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204452.14317: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204452.14368: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204452.14418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204452.14456: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204452.14568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.14613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.14655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.14751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.14754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.14883: variable 'ansible_distribution' from source: facts 32134 1727204452.14896: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.14909: Evaluated 
conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32134 1727204452.15059: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204452.15250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.15296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.15326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.15406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.15409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.15462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.15498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.15694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.15697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.15700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.15703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.15705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.15725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.15777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.15799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 32134 1727204452.16028: variable 'network_connections' from source: play vars 32134 1727204452.16050: variable 'profile' from source: play vars 32134 1727204452.16145: variable 'profile' from source: play vars 32134 1727204452.16157: variable 'interface' from source: set_fact 32134 1727204452.16236: variable 'interface' from source: set_fact 32134 1727204452.16362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204452.16554: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204452.16611: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204452.16652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204452.16796: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204452.16799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204452.16802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204452.16826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.16861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204452.16925: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204452.17272: variable 'network_connections' from source: play vars 32134 1727204452.17283: variable 'profile' from source: play vars 32134 1727204452.17367: variable 'profile' from source: play vars 32134 1727204452.17377: variable 'interface' from source: set_fact 32134 1727204452.17455: variable 'interface' from source: set_fact 32134 1727204452.17488: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204452.17500: when evaluation is False, skipping this task 32134 1727204452.17508: _execute() done 32134 1727204452.17515: dumping result to json 32134 1727204452.17523: done dumping result, returning 32134 1727204452.17536: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000006c] 32134 1727204452.17547: sending task result for task 12b410aa-8751-753f-5162-00000000006c skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204452.17721: no more pending results, returning what we have 32134 1727204452.17726: results queue empty 32134 1727204452.17727: checking for any_errors_fatal 32134 1727204452.17738: done checking for any_errors_fatal 32134 1727204452.17739: checking for max_fail_percentage 32134 1727204452.17741: done checking for 
max_fail_percentage 32134 1727204452.17743: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.17744: done checking to see if all hosts have failed 32134 1727204452.17745: getting the remaining hosts for this loop 32134 1727204452.17746: done getting the remaining hosts for this loop 32134 1727204452.17751: getting the next task for host managed-node2 32134 1727204452.17759: done getting next task for host managed-node2 32134 1727204452.17764: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32134 1727204452.17767: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204452.17783: getting variables 32134 1727204452.17785: in VariableManager get_vars() 32134 1727204452.17832: Calling all_inventory to load vars for managed-node2 32134 1727204452.17835: Calling groups_inventory to load vars for managed-node2 32134 1727204452.17838: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.17851: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.17855: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.17858: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.18716: done sending task result for task 12b410aa-8751-753f-5162-00000000006c 32134 1727204452.18719: WORKER PROCESS EXITING 32134 1727204452.20441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.23421: done with get_vars() 32134 1727204452.23460: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32134 1727204452.23551: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.143) 0:00:26.639 ***** 32134 1727204452.23585: entering _queue_task() for managed-node2/yum 32134 1727204452.23926: worker is 1 (out of 1 available) 32134 1727204452.23939: exiting _queue_task() for managed-node2/yum 32134 1727204452.23953: done queuing things up, now waiting for results queue to drain 32134 1727204452.23955: waiting for pending results... 
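The DNF check at tasks/main.yml:36 is gated on two role facts, __network_wireless_connections_defined and __network_team_connections_defined, and the log shows them being resolved against network_connections, profile, and interface before both come back False. The role's own definitions are not shown in this log; the snippet below is only an illustration of how such flags could be derived from network_connections with standard Jinja filters:

# Illustration only: one way to flag wireless or team profiles in network_connections.
__network_wireless_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
  | selectattr('type', 'equalto', 'wireless') | list | length > 0 }}
__network_team_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
  | selectattr('type', 'equalto', 'team') | list | length > 0 }}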
32134 1727204452.24262: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32134 1727204452.24397: in run() - task 12b410aa-8751-753f-5162-00000000006d 32134 1727204452.24424: variable 'ansible_search_path' from source: unknown 32134 1727204452.24432: variable 'ansible_search_path' from source: unknown 32134 1727204452.24475: calling self._execute() 32134 1727204452.24588: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.24604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.24620: variable 'omit' from source: magic vars 32134 1727204452.25067: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.25086: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.25234: variable 'connection_failed' from source: set_fact 32134 1727204452.25395: Evaluated conditional (not connection_failed): True 32134 1727204452.25398: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.25400: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.25532: variable 'connection_failed' from source: set_fact 32134 1727204452.25544: Evaluated conditional (not connection_failed): True 32134 1727204452.25771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204452.28776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204452.28865: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204452.28918: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204452.28966: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204452.29008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204452.29107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.29154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.29224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.29254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.29278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.29400: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.29424: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32134 
1727204452.29442: when evaluation is False, skipping this task 32134 1727204452.29549: _execute() done 32134 1727204452.29553: dumping result to json 32134 1727204452.29556: done dumping result, returning 32134 1727204452.29558: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000006d] 32134 1727204452.29561: sending task result for task 12b410aa-8751-753f-5162-00000000006d 32134 1727204452.29646: done sending task result for task 12b410aa-8751-753f-5162-00000000006d 32134 1727204452.29649: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32134 1727204452.29711: no more pending results, returning what we have 32134 1727204452.29716: results queue empty 32134 1727204452.29717: checking for any_errors_fatal 32134 1727204452.29724: done checking for any_errors_fatal 32134 1727204452.29725: checking for max_fail_percentage 32134 1727204452.29727: done checking for max_fail_percentage 32134 1727204452.29728: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.29730: done checking to see if all hosts have failed 32134 1727204452.29731: getting the remaining hosts for this loop 32134 1727204452.29732: done getting the remaining hosts for this loop 32134 1727204452.29737: getting the next task for host managed-node2 32134 1727204452.29745: done getting next task for host managed-node2 32134 1727204452.29751: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32134 1727204452.29753: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204452.29770: getting variables 32134 1727204452.29772: in VariableManager get_vars() 32134 1727204452.30019: Calling all_inventory to load vars for managed-node2 32134 1727204452.30023: Calling groups_inventory to load vars for managed-node2 32134 1727204452.30026: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.30039: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.30042: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.30047: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.32546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.35596: done with get_vars() 32134 1727204452.35638: done getting variables 32134 1727204452.35714: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.121) 0:00:26.761 ***** 32134 1727204452.35752: entering _queue_task() for managed-node2/fail 32134 1727204452.36319: worker is 1 (out of 1 available) 32134 1727204452.36331: exiting _queue_task() for managed-node2/fail 32134 1727204452.36341: done queuing things up, now waiting for results queue to drain 32134 1727204452.36344: waiting for pending results... 
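The YUM variant at tasks/main.yml:48 is the EL7 counterpart: it is guarded by ansible_distribution_major_version | int < 8, which is False here, so it is skipped before the wireless/team flags are even consulted. Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line: on ansible-core 2.17 the yum action is a redirect to dnf, so the same action plugin would back this task anyway. A sketch under those assumptions; the when list mirrors the log, the module arguments are illustrative:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:  # redirects to ansible.builtin.dnf on ansible-core 2.17, as the log shows
    name: "{{ network_packages }}"
    state: latest
  check_mode: true      # assumption: the task only checks for available updates here
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined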
32134 1727204452.36588: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32134 1727204452.36632: in run() - task 12b410aa-8751-753f-5162-00000000006e 32134 1727204452.36659: variable 'ansible_search_path' from source: unknown 32134 1727204452.36668: variable 'ansible_search_path' from source: unknown 32134 1727204452.36792: calling self._execute() 32134 1727204452.36846: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.36861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.36881: variable 'omit' from source: magic vars 32134 1727204452.37363: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.37383: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.37534: variable 'connection_failed' from source: set_fact 32134 1727204452.37552: Evaluated conditional (not connection_failed): True 32134 1727204452.37700: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.37713: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.37845: variable 'connection_failed' from source: set_fact 32134 1727204452.37857: Evaluated conditional (not connection_failed): True 32134 1727204452.38012: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204452.38266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204452.40938: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204452.41024: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204452.41076: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204452.41124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204452.41248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204452.41262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.41318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.41354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.41415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.41437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.41504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.41538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.41573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.41633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.41654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.41714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.41748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.41795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.41839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.41906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.42093: variable 'network_connections' from source: play vars 32134 1727204452.42111: variable 'profile' from source: play vars 32134 1727204452.42207: variable 'profile' from source: play vars 32134 1727204452.42217: variable 'interface' from source: set_fact 32134 1727204452.42301: variable 'interface' from source: set_fact 32134 1727204452.42398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204452.42612: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204452.42666: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204452.42773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204452.42776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204452.42805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204452.42836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 
1727204452.42870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.42914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204452.42972: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204452.43314: variable 'network_connections' from source: play vars 32134 1727204452.43331: variable 'profile' from source: play vars 32134 1727204452.43411: variable 'profile' from source: play vars 32134 1727204452.43431: variable 'interface' from source: set_fact 32134 1727204452.43541: variable 'interface' from source: set_fact 32134 1727204452.43551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204452.43560: when evaluation is False, skipping this task 32134 1727204452.43593: _execute() done 32134 1727204452.43596: dumping result to json 32134 1727204452.43598: done dumping result, returning 32134 1727204452.43601: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000006e] 32134 1727204452.43603: sending task result for task 12b410aa-8751-753f-5162-00000000006e skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204452.43947: no more pending results, returning what we have 32134 1727204452.43950: results queue empty 32134 1727204452.43952: checking for any_errors_fatal 32134 1727204452.43961: done checking for any_errors_fatal 32134 1727204452.43962: checking for max_fail_percentage 32134 1727204452.43964: done checking for max_fail_percentage 32134 1727204452.43965: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.43967: done checking to see if all hosts have failed 32134 1727204452.43968: getting the remaining hosts for this loop 32134 1727204452.43969: done getting the remaining hosts for this loop 32134 1727204452.43973: getting the next task for host managed-node2 32134 1727204452.43980: done getting next task for host managed-node2 32134 1727204452.43985: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32134 1727204452.43988: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204452.44005: getting variables 32134 1727204452.44007: in VariableManager get_vars() 32134 1727204452.44049: Calling all_inventory to load vars for managed-node2 32134 1727204452.44052: Calling groups_inventory to load vars for managed-node2 32134 1727204452.44056: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.44067: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.44071: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.44074: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.44607: done sending task result for task 12b410aa-8751-753f-5162-00000000006e 32134 1727204452.44610: WORKER PROCESS EXITING 32134 1727204452.46646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.49647: done with get_vars() 32134 1727204452.49691: done getting variables 32134 1727204452.49762: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.140) 0:00:26.901 ***** 32134 1727204452.49802: entering _queue_task() for managed-node2/package 32134 1727204452.50161: worker is 1 (out of 1 available) 32134 1727204452.50176: exiting _queue_task() for managed-node2/package 32134 1727204452.50191: done queuing things up, now waiting for results queue to drain 32134 1727204452.50193: waiting for pending results... 
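Annotation: the records below show the role's "Install packages" task (tasks/main.yml:73) being evaluated and then skipped, because the condition not network_packages is subset(ansible_facts.packages.keys()) comes out False, i.e. every package listed in network_packages is already present in the package facts gathered earlier. A minimal sketch of a task guarded this way, assuming typical package-module arguments (only the task name, the package action plugin and the when condition are taken from the log; the argument list is an assumption, not the actual role source):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Note that ansible_facts.packages only exists after a package_facts run, which is why this guard can be evaluated without touching the package manager at all.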
32134 1727204452.50498: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 32134 1727204452.50624: in run() - task 12b410aa-8751-753f-5162-00000000006f 32134 1727204452.50646: variable 'ansible_search_path' from source: unknown 32134 1727204452.50654: variable 'ansible_search_path' from source: unknown 32134 1727204452.50699: calling self._execute() 32134 1727204452.50814: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.50832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.50850: variable 'omit' from source: magic vars 32134 1727204452.51309: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.51327: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.51472: variable 'connection_failed' from source: set_fact 32134 1727204452.51488: Evaluated conditional (not connection_failed): True 32134 1727204452.51640: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.51652: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.51782: variable 'connection_failed' from source: set_fact 32134 1727204452.51795: Evaluated conditional (not connection_failed): True 32134 1727204452.52058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204452.52377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204452.52440: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204452.52494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204452.52579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204452.52708: variable 'network_packages' from source: role '' defaults 32134 1727204452.52845: variable '__network_provider_setup' from source: role '' defaults 32134 1727204452.52864: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204452.52961: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204452.52978: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204452.53096: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204452.53339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204452.55723: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204452.55813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204452.55857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204452.55994: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204452.55998: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204452.56034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 
1727204452.56070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.56105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.56166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.56192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.56258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.56293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.56331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.56386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.56409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.56711: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32134 1727204452.56869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.56908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.57000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.57004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.57025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.57144: variable 'ansible_python' from source: facts 32134 1727204452.57182: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32134 1727204452.57293: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 
1727204452.57444: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204452.57592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.57626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.57665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.57729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.57751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.57878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.57882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.57894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.57947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.57970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.58160: variable 'network_connections' from source: play vars 32134 1727204452.58173: variable 'profile' from source: play vars 32134 1727204452.58301: variable 'profile' from source: play vars 32134 1727204452.58324: variable 'interface' from source: set_fact 32134 1727204452.58434: variable 'interface' from source: set_fact 32134 1727204452.58504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204452.58543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204452.58578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.58653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 32134 1727204452.58685: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204452.59056: variable 'network_connections' from source: play vars 32134 1727204452.59068: variable 'profile' from source: play vars 32134 1727204452.59197: variable 'profile' from source: play vars 32134 1727204452.59296: variable 'interface' from source: set_fact 32134 1727204452.59299: variable 'interface' from source: set_fact 32134 1727204452.59345: variable '__network_packages_default_wireless' from source: role '' defaults 32134 1727204452.59456: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204452.59882: variable 'network_connections' from source: play vars 32134 1727204452.59896: variable 'profile' from source: play vars 32134 1727204452.59982: variable 'profile' from source: play vars 32134 1727204452.59997: variable 'interface' from source: set_fact 32134 1727204452.60122: variable 'interface' from source: set_fact 32134 1727204452.60157: variable '__network_packages_default_team' from source: role '' defaults 32134 1727204452.60270: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204452.60696: variable 'network_connections' from source: play vars 32134 1727204452.60708: variable 'profile' from source: play vars 32134 1727204452.60796: variable 'profile' from source: play vars 32134 1727204452.60807: variable 'interface' from source: set_fact 32134 1727204452.60933: variable 'interface' from source: set_fact 32134 1727204452.61018: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204452.61097: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204452.61114: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204452.61198: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204452.61508: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32134 1727204452.62169: variable 'network_connections' from source: play vars 32134 1727204452.62181: variable 'profile' from source: play vars 32134 1727204452.62268: variable 'profile' from source: play vars 32134 1727204452.62278: variable 'interface' from source: set_fact 32134 1727204452.62495: variable 'interface' from source: set_fact 32134 1727204452.62498: variable 'ansible_distribution' from source: facts 32134 1727204452.62500: variable '__network_rh_distros' from source: role '' defaults 32134 1727204452.62503: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.62505: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32134 1727204452.62642: variable 'ansible_distribution' from source: facts 32134 1727204452.62652: variable '__network_rh_distros' from source: role '' defaults 32134 1727204452.62662: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.62673: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32134 1727204452.62904: variable 'ansible_distribution' from source: facts 32134 1727204452.62913: variable '__network_rh_distros' from source: role '' defaults 32134 1727204452.62924: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.62974: variable 'network_provider' from source: set_fact 32134 1727204452.62999: variable 'ansible_facts' from 
source: unknown 32134 1727204452.64223: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32134 1727204452.64233: when evaluation is False, skipping this task 32134 1727204452.64243: _execute() done 32134 1727204452.64255: dumping result to json 32134 1727204452.64264: done dumping result, returning 32134 1727204452.64276: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-753f-5162-00000000006f] 32134 1727204452.64287: sending task result for task 12b410aa-8751-753f-5162-00000000006f 32134 1727204452.64549: done sending task result for task 12b410aa-8751-753f-5162-00000000006f 32134 1727204452.64552: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32134 1727204452.64613: no more pending results, returning what we have 32134 1727204452.64618: results queue empty 32134 1727204452.64619: checking for any_errors_fatal 32134 1727204452.64630: done checking for any_errors_fatal 32134 1727204452.64631: checking for max_fail_percentage 32134 1727204452.64634: done checking for max_fail_percentage 32134 1727204452.64635: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.64636: done checking to see if all hosts have failed 32134 1727204452.64637: getting the remaining hosts for this loop 32134 1727204452.64639: done getting the remaining hosts for this loop 32134 1727204452.64644: getting the next task for host managed-node2 32134 1727204452.64652: done getting next task for host managed-node2 32134 1727204452.64657: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32134 1727204452.64665: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204452.64681: getting variables 32134 1727204452.64684: in VariableManager get_vars() 32134 1727204452.64928: Calling all_inventory to load vars for managed-node2 32134 1727204452.64932: Calling groups_inventory to load vars for managed-node2 32134 1727204452.64936: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.64948: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.64951: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.64955: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.67172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.70178: done with get_vars() 32134 1727204452.70228: done getting variables 32134 1727204452.70308: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.205) 0:00:27.107 ***** 32134 1727204452.70350: entering _queue_task() for managed-node2/package 32134 1727204452.71340: worker is 1 (out of 1 available) 32134 1727204452.71355: exiting _queue_task() for managed-node2/package 32134 1727204452.71369: done queuing things up, now waiting for results queue to drain 32134 1727204452.71371: waiting for pending results... 
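Annotation: the task queued above, "Install NetworkManager and nmstate when using network_state variable" (tasks/main.yml:85), is skipped in the records that follow because network_state is left at its empty default, so network_state != {} is False. Sketch only, with package names inferred from the task title rather than copied from the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}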
32134 1727204452.72109: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32134 1727204452.72359: in run() - task 12b410aa-8751-753f-5162-000000000070 32134 1727204452.72363: variable 'ansible_search_path' from source: unknown 32134 1727204452.72367: variable 'ansible_search_path' from source: unknown 32134 1727204452.72797: calling self._execute() 32134 1727204452.73008: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.73018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.73036: variable 'omit' from source: magic vars 32134 1727204452.74009: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.74081: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.74359: variable 'connection_failed' from source: set_fact 32134 1727204452.74437: Evaluated conditional (not connection_failed): True 32134 1727204452.74760: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.74775: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.74998: variable 'connection_failed' from source: set_fact 32134 1727204452.75117: Evaluated conditional (not connection_failed): True 32134 1727204452.75549: variable 'network_state' from source: role '' defaults 32134 1727204452.75553: Evaluated conditional (network_state != {}): False 32134 1727204452.75555: when evaluation is False, skipping this task 32134 1727204452.75558: _execute() done 32134 1727204452.75560: dumping result to json 32134 1727204452.75562: done dumping result, returning 32134 1727204452.75565: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-753f-5162-000000000070] 32134 1727204452.75567: sending task result for task 12b410aa-8751-753f-5162-000000000070 32134 1727204452.75969: done sending task result for task 12b410aa-8751-753f-5162-000000000070 32134 1727204452.75973: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204452.76151: no more pending results, returning what we have 32134 1727204452.76157: results queue empty 32134 1727204452.76158: checking for any_errors_fatal 32134 1727204452.76171: done checking for any_errors_fatal 32134 1727204452.76172: checking for max_fail_percentage 32134 1727204452.76174: done checking for max_fail_percentage 32134 1727204452.76175: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.76176: done checking to see if all hosts have failed 32134 1727204452.76177: getting the remaining hosts for this loop 32134 1727204452.76178: done getting the remaining hosts for this loop 32134 1727204452.76183: getting the next task for host managed-node2 32134 1727204452.76194: done getting next task for host managed-node2 32134 1727204452.76199: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32134 1727204452.76202: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 32134 1727204452.76220: getting variables 32134 1727204452.76222: in VariableManager get_vars() 32134 1727204452.76266: Calling all_inventory to load vars for managed-node2 32134 1727204452.76269: Calling groups_inventory to load vars for managed-node2 32134 1727204452.76272: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.76286: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.76493: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.76500: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.80355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.83600: done with get_vars() 32134 1727204452.83656: done getting variables 32134 1727204452.83735: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.134) 0:00:27.241 ***** 32134 1727204452.83776: entering _queue_task() for managed-node2/package 32134 1727204452.84183: worker is 1 (out of 1 available) 32134 1727204452.84201: exiting _queue_task() for managed-node2/package 32134 1727204452.84218: done queuing things up, now waiting for results queue to drain 32134 1727204452.84220: waiting for pending results... 
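Annotation: the same pair of inherited conditions, ansible_distribution_major_version != '6' and not connection_failed, is re-evaluated at the start of every role task in this stretch of the log. That is the behaviour you get when the role is brought in under a when: on an enclosing block or static import, since Ansible re-checks such inherited conditions on each contained task. A minimal sketch of that structure, under the assumption of a block around a static import (only the two conditions come from the log; the surrounding play layout is an assumption):

- name: Apply the network role on supported hosts only
  when:
    - ansible_distribution_major_version != '6'
    - not connection_failed
  block:
    - name: Configure networking
      ansible.builtin.import_role:
        name: fedora.linux_system_roles.network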
32134 1727204452.84606: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32134 1727204452.84611: in run() - task 12b410aa-8751-753f-5162-000000000071 32134 1727204452.84637: variable 'ansible_search_path' from source: unknown 32134 1727204452.84647: variable 'ansible_search_path' from source: unknown 32134 1727204452.84695: calling self._execute() 32134 1727204452.84813: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.84828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.84847: variable 'omit' from source: magic vars 32134 1727204452.85310: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.85332: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.85481: variable 'connection_failed' from source: set_fact 32134 1727204452.85497: Evaluated conditional (not connection_failed): True 32134 1727204452.85642: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.85655: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.85792: variable 'connection_failed' from source: set_fact 32134 1727204452.85805: Evaluated conditional (not connection_failed): True 32134 1727204452.85961: variable 'network_state' from source: role '' defaults 32134 1727204452.86194: Evaluated conditional (network_state != {}): False 32134 1727204452.86197: when evaluation is False, skipping this task 32134 1727204452.86200: _execute() done 32134 1727204452.86202: dumping result to json 32134 1727204452.86204: done dumping result, returning 32134 1727204452.86207: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-753f-5162-000000000071] 32134 1727204452.86209: sending task result for task 12b410aa-8751-753f-5162-000000000071 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204452.86345: no more pending results, returning what we have 32134 1727204452.86349: results queue empty 32134 1727204452.86351: checking for any_errors_fatal 32134 1727204452.86358: done checking for any_errors_fatal 32134 1727204452.86359: checking for max_fail_percentage 32134 1727204452.86360: done checking for max_fail_percentage 32134 1727204452.86361: checking to see if all hosts have failed and the running result is not ok 32134 1727204452.86362: done checking to see if all hosts have failed 32134 1727204452.86363: getting the remaining hosts for this loop 32134 1727204452.86365: done getting the remaining hosts for this loop 32134 1727204452.86371: getting the next task for host managed-node2 32134 1727204452.86377: done getting next task for host managed-node2 32134 1727204452.86381: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32134 1727204452.86384: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204452.86396: done sending task result for task 12b410aa-8751-753f-5162-000000000071 32134 1727204452.86399: WORKER PROCESS EXITING 32134 1727204452.86504: getting variables 32134 1727204452.86506: in VariableManager get_vars() 32134 1727204452.86552: Calling all_inventory to load vars for managed-node2 32134 1727204452.86556: Calling groups_inventory to load vars for managed-node2 32134 1727204452.86559: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204452.86570: Calling all_plugins_play to load vars for managed-node2 32134 1727204452.86574: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204452.86578: Calling groups_plugins_play to load vars for managed-node2 32134 1727204452.89047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204452.92257: done with get_vars() 32134 1727204452.92321: done getting variables 32134 1727204452.92408: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:00:52 -0400 (0:00:00.086) 0:00:27.328 ***** 32134 1727204452.92449: entering _queue_task() for managed-node2/service 32134 1727204452.92876: worker is 1 (out of 1 available) 32134 1727204452.92999: exiting _queue_task() for managed-node2/service 32134 1727204452.93013: done queuing things up, now waiting for results queue to drain 32134 1727204452.93016: waiting for pending results... 
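Annotation: the records below show "Restart NetworkManager due to wireless or team interfaces" (tasks/main.yml:109, a service action) being skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the single profile in network_connections. A rough sketch of such a task (task name, service action plugin and condition are from the log; the unit name and target state are assumptions):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined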
32134 1727204452.93263: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32134 1727204452.93394: in run() - task 12b410aa-8751-753f-5162-000000000072 32134 1727204452.93418: variable 'ansible_search_path' from source: unknown 32134 1727204452.93423: variable 'ansible_search_path' from source: unknown 32134 1727204452.93470: calling self._execute() 32134 1727204452.93598: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204452.93604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204452.93627: variable 'omit' from source: magic vars 32134 1727204452.94145: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.94156: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.94314: variable 'connection_failed' from source: set_fact 32134 1727204452.94323: Evaluated conditional (not connection_failed): True 32134 1727204452.94471: variable 'ansible_distribution_major_version' from source: facts 32134 1727204452.94477: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204452.94613: variable 'connection_failed' from source: set_fact 32134 1727204452.94622: Evaluated conditional (not connection_failed): True 32134 1727204452.94797: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204452.95061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204452.97896: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204452.97900: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204452.97931: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204452.97975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204452.98015: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204452.98115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.98166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.98203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.98264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.98280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.98350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.98378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.98409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.98470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.98694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.98698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204452.98702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204452.98704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.98707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204452.98709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204452.98920: variable 'network_connections' from source: play vars 32134 1727204452.98935: variable 'profile' from source: play vars 32134 1727204452.99038: variable 'profile' from source: play vars 32134 1727204452.99041: variable 'interface' from source: set_fact 32134 1727204452.99129: variable 'interface' from source: set_fact 32134 1727204452.99232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204452.99456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204452.99504: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204452.99551: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204452.99584: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204452.99645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204452.99672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 
1727204452.99704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204452.99738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204452.99807: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204453.00161: variable 'network_connections' from source: play vars 32134 1727204453.00167: variable 'profile' from source: play vars 32134 1727204453.00252: variable 'profile' from source: play vars 32134 1727204453.00256: variable 'interface' from source: set_fact 32134 1727204453.00340: variable 'interface' from source: set_fact 32134 1727204453.00370: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204453.00374: when evaluation is False, skipping this task 32134 1727204453.00379: _execute() done 32134 1727204453.00384: dumping result to json 32134 1727204453.00388: done dumping result, returning 32134 1727204453.00407: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-753f-5162-000000000072] 32134 1727204453.00412: sending task result for task 12b410aa-8751-753f-5162-000000000072 32134 1727204453.00525: done sending task result for task 12b410aa-8751-753f-5162-000000000072 32134 1727204453.00528: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204453.00583: no more pending results, returning what we have 32134 1727204453.00587: results queue empty 32134 1727204453.00588: checking for any_errors_fatal 32134 1727204453.00601: done checking for any_errors_fatal 32134 1727204453.00602: checking for max_fail_percentage 32134 1727204453.00604: done checking for max_fail_percentage 32134 1727204453.00605: checking to see if all hosts have failed and the running result is not ok 32134 1727204453.00606: done checking to see if all hosts have failed 32134 1727204453.00607: getting the remaining hosts for this loop 32134 1727204453.00609: done getting the remaining hosts for this loop 32134 1727204453.00617: getting the next task for host managed-node2 32134 1727204453.00625: done getting next task for host managed-node2 32134 1727204453.00630: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32134 1727204453.00633: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204453.00649: getting variables 32134 1727204453.00651: in VariableManager get_vars() 32134 1727204453.00902: Calling all_inventory to load vars for managed-node2 32134 1727204453.00907: Calling groups_inventory to load vars for managed-node2 32134 1727204453.00910: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204453.00923: Calling all_plugins_play to load vars for managed-node2 32134 1727204453.00927: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204453.00931: Calling groups_plugins_play to load vars for managed-node2 32134 1727204453.03292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204453.06504: done with get_vars() 32134 1727204453.06561: done getting variables 32134 1727204453.06638: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:00:53 -0400 (0:00:00.142) 0:00:27.470 ***** 32134 1727204453.06677: entering _queue_task() for managed-node2/service 32134 1727204453.07076: worker is 1 (out of 1 available) 32134 1727204453.07296: exiting _queue_task() for managed-node2/service 32134 1727204453.07307: done queuing things up, now waiting for results queue to drain 32134 1727204453.07309: waiting for pending results... 32134 1727204453.07511: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32134 1727204453.07599: in run() - task 12b410aa-8751-753f-5162-000000000073 32134 1727204453.07603: variable 'ansible_search_path' from source: unknown 32134 1727204453.07607: variable 'ansible_search_path' from source: unknown 32134 1727204453.07679: calling self._execute() 32134 1727204453.07750: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204453.07767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204453.07780: variable 'omit' from source: magic vars 32134 1727204453.08267: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.08279: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204453.08441: variable 'connection_failed' from source: set_fact 32134 1727204453.08474: Evaluated conditional (not connection_failed): True 32134 1727204453.08599: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.08605: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204453.08739: variable 'connection_failed' from source: set_fact 32134 1727204453.08780: Evaluated conditional (not connection_failed): True 32134 1727204453.08972: variable 'network_provider' from source: set_fact 32134 1727204453.08978: variable 'network_state' from source: role '' defaults 32134 1727204453.09015: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32134 1727204453.09019: variable 'omit' from source: magic vars 32134 1727204453.09050: variable 'omit' from source: magic vars 32134 1727204453.09096: variable 'network_service_name' from source: 
role '' defaults 32134 1727204453.09219: variable 'network_service_name' from source: role '' defaults 32134 1727204453.09342: variable '__network_provider_setup' from source: role '' defaults 32134 1727204453.09345: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204453.09433: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204453.09450: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204453.09559: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204453.09859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204453.19744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204453.19833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204453.19925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204453.19941: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204453.19976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204453.20077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.20114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.20158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.20250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.20254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.20297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.20329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.20368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.20466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.20470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.20774: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32134 1727204453.20954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.20982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.21015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.21070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.21086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.21233: variable 'ansible_python' from source: facts 32134 1727204453.21237: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32134 1727204453.21347: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204453.21459: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204453.21668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.21675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.21714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.21761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.21886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.21891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.21895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.21911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 
1727204453.21962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.21979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.22177: variable 'network_connections' from source: play vars 32134 1727204453.22187: variable 'profile' from source: play vars 32134 1727204453.22294: variable 'profile' from source: play vars 32134 1727204453.22302: variable 'interface' from source: set_fact 32134 1727204453.22386: variable 'interface' from source: set_fact 32134 1727204453.22554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204453.22781: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204453.22853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204453.22936: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204453.22983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204453.23095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204453.23115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204453.23158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.23224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204453.23258: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204453.23894: variable 'network_connections' from source: play vars 32134 1727204453.23898: variable 'profile' from source: play vars 32134 1727204453.23901: variable 'profile' from source: play vars 32134 1727204453.23903: variable 'interface' from source: set_fact 32134 1727204453.23905: variable 'interface' from source: set_fact 32134 1727204453.23929: variable '__network_packages_default_wireless' from source: role '' defaults 32134 1727204453.24038: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204453.24468: variable 'network_connections' from source: play vars 32134 1727204453.24475: variable 'profile' from source: play vars 32134 1727204453.24564: variable 'profile' from source: play vars 32134 1727204453.24579: variable 'interface' from source: set_fact 32134 1727204453.24671: variable 'interface' from source: set_fact 32134 1727204453.24717: variable '__network_packages_default_team' from source: role '' defaults 32134 1727204453.24826: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204453.25253: variable 'network_connections' from source: play vars 32134 
1727204453.25260: variable 'profile' from source: play vars 32134 1727204453.25356: variable 'profile' from source: play vars 32134 1727204453.25362: variable 'interface' from source: set_fact 32134 1727204453.25459: variable 'interface' from source: set_fact 32134 1727204453.25536: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204453.25621: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204453.25630: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204453.25714: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204453.26043: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32134 1727204453.26783: variable 'network_connections' from source: play vars 32134 1727204453.26790: variable 'profile' from source: play vars 32134 1727204453.26882: variable 'profile' from source: play vars 32134 1727204453.26886: variable 'interface' from source: set_fact 32134 1727204453.27083: variable 'interface' from source: set_fact 32134 1727204453.27086: variable 'ansible_distribution' from source: facts 32134 1727204453.27092: variable '__network_rh_distros' from source: role '' defaults 32134 1727204453.27095: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.27097: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32134 1727204453.27293: variable 'ansible_distribution' from source: facts 32134 1727204453.27309: variable '__network_rh_distros' from source: role '' defaults 32134 1727204453.27324: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.27337: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32134 1727204453.27585: variable 'ansible_distribution' from source: facts 32134 1727204453.27598: variable '__network_rh_distros' from source: role '' defaults 32134 1727204453.27630: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.27667: variable 'network_provider' from source: set_fact 32134 1727204453.27703: variable 'omit' from source: magic vars 32134 1727204453.27845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204453.27849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204453.27852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204453.27854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204453.27856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204453.27872: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204453.27881: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204453.27891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204453.28032: Set connection var ansible_timeout to 10 32134 1727204453.28062: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204453.28071: Set connection var ansible_connection to ssh 32134 1727204453.28085: Set connection var ansible_shell_type to sh 32134 1727204453.28102: Set 
connection var ansible_shell_executable to /bin/sh 32134 1727204453.28118: Set connection var ansible_pipelining to False 32134 1727204453.28174: variable 'ansible_shell_executable' from source: unknown 32134 1727204453.28177: variable 'ansible_connection' from source: unknown 32134 1727204453.28180: variable 'ansible_module_compression' from source: unknown 32134 1727204453.28280: variable 'ansible_shell_type' from source: unknown 32134 1727204453.28284: variable 'ansible_shell_executable' from source: unknown 32134 1727204453.28286: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204453.28290: variable 'ansible_pipelining' from source: unknown 32134 1727204453.28293: variable 'ansible_timeout' from source: unknown 32134 1727204453.28295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204453.28375: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204453.28396: variable 'omit' from source: magic vars 32134 1727204453.28494: starting attempt loop 32134 1727204453.28498: running the handler 32134 1727204453.28532: variable 'ansible_facts' from source: unknown 32134 1727204453.29756: _low_level_execute_command(): starting 32134 1727204453.29767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204453.30604: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204453.30669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204453.30684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204453.30838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204453.32659: stdout chunk (state=3): >>>/root <<< 32134 1727204453.32846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204453.32891: stdout chunk (state=3): >>><<< 32134 1727204453.32895: stderr chunk (state=3): >>><<< 32134 1727204453.32917: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204453.32937: _low_level_execute_command(): starting 32134 1727204453.33040: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320 `" && echo ansible-tmp-1727204453.3292477-33570-35068172996320="` echo /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320 `" ) && sleep 0' 32134 1727204453.33670: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204453.33684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204453.33722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204453.33841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204453.33915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204453.33930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204453.36241: stdout chunk (state=3): >>>ansible-tmp-1727204453.3292477-33570-35068172996320=/root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320 <<< 32134 1727204453.36378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204453.36381: stdout chunk (state=3): >>><<< 32134 1727204453.36384: stderr chunk (state=3): >>><<< 32134 1727204453.36595: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204453.3292477-33570-35068172996320=/root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204453.36599: variable 'ansible_module_compression' from source: unknown 32134 1727204453.36601: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 32134 1727204453.36604: variable 'ansible_facts' from source: unknown 32134 1727204453.36788: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py 32134 1727204453.37062: Sending initial data 32134 1727204453.37073: Sent initial data (155 bytes) 32134 1727204453.37705: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204453.37750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204453.37763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204453.37773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204453.37848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204453.39550: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204453.39596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204453.39656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpt6bz7ji0 /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py <<< 32134 1727204453.39660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py" <<< 32134 1727204453.39705: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpt6bz7ji0" to remote "/root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py" <<< 32134 1727204453.42658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204453.42661: stdout chunk (state=3): >>><<< 32134 1727204453.42664: stderr chunk (state=3): >>><<< 32134 1727204453.42666: done transferring module to remote 32134 1727204453.42669: _low_level_execute_command(): starting 32134 1727204453.42671: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/ /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py && sleep 0' 32134 1727204453.43269: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204453.43288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204453.43350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204453.43435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204453.43475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204453.43507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204453.43556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204453.45698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204453.45702: stdout chunk (state=3): >>><<< 32134 1727204453.45705: stderr chunk (state=3): >>><<< 32134 1727204453.45707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204453.45710: _low_level_execute_command(): starting 32134 1727204453.45715: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/AnsiballZ_systemd.py && sleep 0' 32134 1727204453.46331: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204453.46364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204453.46417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204453.46463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204453.46483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204453.46556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204453.46638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204453.46695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204453.80218: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": 
"restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4468736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1587841000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 32134 1727204453.80242: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": 
"14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target 
cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32134 1727204453.82210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204453.82273: stderr chunk (state=3): >>><<< 32134 1727204453.82276: stdout chunk (state=3): >>><<< 32134 1727204453.82299: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4468736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1587841000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204453.82459: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204453.82479: _low_level_execute_command(): starting 32134 1727204453.82482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204453.3292477-33570-35068172996320/ > /dev/null 2>&1 && sleep 0' 32134 1727204453.82951: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204453.82954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204453.82957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204453.82959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204453.82961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204453.83008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204453.83012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204453.83060: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32134 1727204453.85031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204453.85080: stderr chunk (state=3): >>><<< 32134 1727204453.85084: stdout chunk (state=3): >>><<< 32134 1727204453.85103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204453.85114: handler run complete 32134 1727204453.85163: attempt loop complete, returning result 32134 1727204453.85166: _execute() done 32134 1727204453.85168: dumping result to json 32134 1727204453.85183: done dumping result, returning 32134 1727204453.85192: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-753f-5162-000000000073] 32134 1727204453.85198: sending task result for task 12b410aa-8751-753f-5162-000000000073 32134 1727204453.85443: done sending task result for task 12b410aa-8751-753f-5162-000000000073 32134 1727204453.85446: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204453.85506: no more pending results, returning what we have 32134 1727204453.85510: results queue empty 32134 1727204453.85511: checking for any_errors_fatal 32134 1727204453.85519: done checking for any_errors_fatal 32134 1727204453.85520: checking for max_fail_percentage 32134 1727204453.85522: done checking for max_fail_percentage 32134 1727204453.85523: checking to see if all hosts have failed and the running result is not ok 32134 1727204453.85525: done checking to see if all hosts have failed 32134 1727204453.85526: getting the remaining hosts for this loop 32134 1727204453.85527: done getting the remaining hosts for this loop 32134 1727204453.85531: getting the next task for host managed-node2 32134 1727204453.85538: done getting next task for host managed-node2 32134 1727204453.85542: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32134 1727204453.85545: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204453.85563: getting variables 32134 1727204453.85565: in VariableManager get_vars() 32134 1727204453.85607: Calling all_inventory to load vars for managed-node2 32134 1727204453.85610: Calling groups_inventory to load vars for managed-node2 32134 1727204453.85613: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204453.85624: Calling all_plugins_play to load vars for managed-node2 32134 1727204453.85628: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204453.85631: Calling groups_plugins_play to load vars for managed-node2 32134 1727204453.90599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204453.92184: done with get_vars() 32134 1727204453.92209: done getting variables 32134 1727204453.92255: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:00:53 -0400 (0:00:00.855) 0:00:28.326 ***** 32134 1727204453.92276: entering _queue_task() for managed-node2/service 32134 1727204453.92554: worker is 1 (out of 1 available) 32134 1727204453.92568: exiting _queue_task() for managed-node2/service 32134 1727204453.92581: done queuing things up, now waiting for results queue to drain 32134 1727204453.92583: waiting for pending results... 32134 1727204453.92791: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32134 1727204453.92879: in run() - task 12b410aa-8751-753f-5162-000000000074 32134 1727204453.92893: variable 'ansible_search_path' from source: unknown 32134 1727204453.92897: variable 'ansible_search_path' from source: unknown 32134 1727204453.92933: calling self._execute() 32134 1727204453.93018: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204453.93024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204453.93042: variable 'omit' from source: magic vars 32134 1727204453.93366: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.93378: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204453.93473: variable 'connection_failed' from source: set_fact 32134 1727204453.93485: Evaluated conditional (not connection_failed): True 32134 1727204453.93578: variable 'ansible_distribution_major_version' from source: facts 32134 1727204453.93582: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204453.93670: variable 'connection_failed' from source: set_fact 32134 1727204453.93674: Evaluated conditional (not connection_failed): True 32134 1727204453.93772: variable 'network_provider' from source: set_fact 32134 1727204453.93776: Evaluated conditional (network_provider == "nm"): True 32134 1727204453.93859: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204453.93934: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204453.94081: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204453.95786: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204453.95847: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204453.95883: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204453.95919: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204453.95941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204453.96015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.96038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.96059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.96099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.96114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.96152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.96171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.96202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.96232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.96244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.96278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204453.96300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204453.96326: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.96357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204453.96369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204453.96485: variable 'network_connections' from source: play vars 32134 1727204453.96497: variable 'profile' from source: play vars 32134 1727204453.96560: variable 'profile' from source: play vars 32134 1727204453.96563: variable 'interface' from source: set_fact 32134 1727204453.96618: variable 'interface' from source: set_fact 32134 1727204453.96680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204453.96819: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204453.96858: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204453.96882: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204453.96908: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204453.96944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204453.97063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204453.97073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204453.97077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204453.97080: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204453.97249: variable 'network_connections' from source: play vars 32134 1727204453.97261: variable 'profile' from source: play vars 32134 1727204453.97314: variable 'profile' from source: play vars 32134 1727204453.97318: variable 'interface' from source: set_fact 32134 1727204453.97366: variable 'interface' from source: set_fact 32134 1727204453.97391: Evaluated conditional (__network_wpa_supplicant_required): False 32134 1727204453.97396: when evaluation is False, skipping this task 32134 1727204453.97400: _execute() done 32134 1727204453.97409: dumping result to json 32134 1727204453.97414: done dumping result, returning 32134 1727204453.97417: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-753f-5162-000000000074] 32134 1727204453.97423: sending task result for task 12b410aa-8751-753f-5162-000000000074 32134 1727204453.97517: done sending task result for task 
12b410aa-8751-753f-5162-000000000074 32134 1727204453.97520: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32134 1727204453.97570: no more pending results, returning what we have 32134 1727204453.97574: results queue empty 32134 1727204453.97575: checking for any_errors_fatal 32134 1727204453.97602: done checking for any_errors_fatal 32134 1727204453.97603: checking for max_fail_percentage 32134 1727204453.97605: done checking for max_fail_percentage 32134 1727204453.97607: checking to see if all hosts have failed and the running result is not ok 32134 1727204453.97608: done checking to see if all hosts have failed 32134 1727204453.97608: getting the remaining hosts for this loop 32134 1727204453.97610: done getting the remaining hosts for this loop 32134 1727204453.97617: getting the next task for host managed-node2 32134 1727204453.97624: done getting next task for host managed-node2 32134 1727204453.97628: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32134 1727204453.97631: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204453.97646: getting variables 32134 1727204453.97648: in VariableManager get_vars() 32134 1727204453.97687: Calling all_inventory to load vars for managed-node2 32134 1727204453.97698: Calling groups_inventory to load vars for managed-node2 32134 1727204453.97701: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204453.97715: Calling all_plugins_play to load vars for managed-node2 32134 1727204453.97718: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204453.97722: Calling groups_plugins_play to load vars for managed-node2 32134 1727204453.99097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204454.00706: done with get_vars() 32134 1727204454.00731: done getting variables 32134 1727204454.00784: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.085) 0:00:28.412 ***** 32134 1727204454.00812: entering _queue_task() for managed-node2/service 32134 1727204454.01073: worker is 1 (out of 1 available) 32134 1727204454.01091: exiting _queue_task() for managed-node2/service 32134 1727204454.01104: done queuing things up, now waiting for results queue to drain 32134 1727204454.01106: waiting for pending results... 
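
Annotation: the ok result above for "Enable and start NetworkManager" (changed=false, output censored because the role task sets no_log: true) corresponds to the ansible.legacy.systemd invocation whose module_args are visible in the captured stdout: name=NetworkManager, state=started, enabled=true, scope=system. The neighbouring "Enable and start wpa_supplicant" and "Enable network service" tasks are skipped because __network_wpa_supplicant_required evaluates to False and network_provider is "nm" rather than "initscripts". As a rough sketch only, the hypothetical standalone task below reproduces that logged invocation; it is reconstructed from the module_args, it is not the role's actual tasks/main.yml, and the when condition is an assumption mirroring the provider checks evaluated for the adjacent tasks.

    # Hypothetical reconstruction from the module_args logged above; the real
    # role task lives in roles/network/tasks/main.yml and is driven by role
    # variables rather than the literals hard-coded here.
    - name: Reproduce the logged NetworkManager service invocation (sketch)
      hosts: managed-node2
      gather_facts: false
      vars:
        network_provider: nm          # set via set_fact in the real run; fixed here for the sketch
      tasks:
        - name: Enable and start NetworkManager
          ansible.builtin.systemd:    # logged as ansible.legacy.systemd
            name: NetworkManager
            state: started
            enabled: true
            scope: system
          # Assumption: the role gates this on the selected provider, as it
          # does for the wpa_supplicant and initscripts tasks evaluated next.
          when: network_provider == "nm"

Because no_log: true is applied to the role task, the full systemd unit property dictionary that appears in the raw stdout chunks above is replaced by the "censored" message in the rendered task result.
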
32134 1727204454.01308: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 32134 1727204454.01396: in run() - task 12b410aa-8751-753f-5162-000000000075 32134 1727204454.01410: variable 'ansible_search_path' from source: unknown 32134 1727204454.01414: variable 'ansible_search_path' from source: unknown 32134 1727204454.01453: calling self._execute() 32134 1727204454.01540: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.01546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.01562: variable 'omit' from source: magic vars 32134 1727204454.01896: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.01908: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.02005: variable 'connection_failed' from source: set_fact 32134 1727204454.02009: Evaluated conditional (not connection_failed): True 32134 1727204454.02103: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.02108: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.02192: variable 'connection_failed' from source: set_fact 32134 1727204454.02196: Evaluated conditional (not connection_failed): True 32134 1727204454.02293: variable 'network_provider' from source: set_fact 32134 1727204454.02297: Evaluated conditional (network_provider == "initscripts"): False 32134 1727204454.02301: when evaluation is False, skipping this task 32134 1727204454.02306: _execute() done 32134 1727204454.02310: dumping result to json 32134 1727204454.02317: done dumping result, returning 32134 1727204454.02328: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-753f-5162-000000000075] 32134 1727204454.02331: sending task result for task 12b410aa-8751-753f-5162-000000000075 32134 1727204454.02430: done sending task result for task 12b410aa-8751-753f-5162-000000000075 32134 1727204454.02433: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204454.02491: no more pending results, returning what we have 32134 1727204454.02495: results queue empty 32134 1727204454.02496: checking for any_errors_fatal 32134 1727204454.02505: done checking for any_errors_fatal 32134 1727204454.02506: checking for max_fail_percentage 32134 1727204454.02508: done checking for max_fail_percentage 32134 1727204454.02509: checking to see if all hosts have failed and the running result is not ok 32134 1727204454.02510: done checking to see if all hosts have failed 32134 1727204454.02511: getting the remaining hosts for this loop 32134 1727204454.02512: done getting the remaining hosts for this loop 32134 1727204454.02517: getting the next task for host managed-node2 32134 1727204454.02523: done getting next task for host managed-node2 32134 1727204454.02527: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32134 1727204454.02530: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204454.02547: getting variables 32134 1727204454.02549: in VariableManager get_vars() 32134 1727204454.02587: Calling all_inventory to load vars for managed-node2 32134 1727204454.02592: Calling groups_inventory to load vars for managed-node2 32134 1727204454.02595: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204454.02604: Calling all_plugins_play to load vars for managed-node2 32134 1727204454.02607: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204454.02611: Calling groups_plugins_play to load vars for managed-node2 32134 1727204454.03835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204454.05465: done with get_vars() 32134 1727204454.05494: done getting variables 32134 1727204454.05549: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.047) 0:00:28.459 ***** 32134 1727204454.05574: entering _queue_task() for managed-node2/copy 32134 1727204454.05848: worker is 1 (out of 1 available) 32134 1727204454.05865: exiting _queue_task() for managed-node2/copy 32134 1727204454.05876: done queuing things up, now waiting for results queue to drain 32134 1727204454.05878: waiting for pending results... 32134 1727204454.06084: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32134 1727204454.06175: in run() - task 12b410aa-8751-753f-5162-000000000076 32134 1727204454.06188: variable 'ansible_search_path' from source: unknown 32134 1727204454.06196: variable 'ansible_search_path' from source: unknown 32134 1727204454.06234: calling self._execute() 32134 1727204454.06322: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.06336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.06340: variable 'omit' from source: magic vars 32134 1727204454.06672: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.06683: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.06779: variable 'connection_failed' from source: set_fact 32134 1727204454.06783: Evaluated conditional (not connection_failed): True 32134 1727204454.06884: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.06887: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.06967: variable 'connection_failed' from source: set_fact 32134 1727204454.06971: Evaluated conditional (not connection_failed): True 32134 1727204454.07070: variable 'network_provider' from source: set_fact 32134 1727204454.07075: Evaluated conditional (network_provider == "initscripts"): False 32134 1727204454.07080: when evaluation is False, skipping this task 32134 1727204454.07083: _execute() done 32134 1727204454.07088: dumping result to json 32134 1727204454.07095: done dumping result, returning 32134 1727204454.07109: done running 
TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-753f-5162-000000000076] 32134 1727204454.07116: sending task result for task 12b410aa-8751-753f-5162-000000000076 32134 1727204454.07218: done sending task result for task 12b410aa-8751-753f-5162-000000000076 32134 1727204454.07221: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32134 1727204454.07270: no more pending results, returning what we have 32134 1727204454.07275: results queue empty 32134 1727204454.07276: checking for any_errors_fatal 32134 1727204454.07284: done checking for any_errors_fatal 32134 1727204454.07285: checking for max_fail_percentage 32134 1727204454.07287: done checking for max_fail_percentage 32134 1727204454.07288: checking to see if all hosts have failed and the running result is not ok 32134 1727204454.07292: done checking to see if all hosts have failed 32134 1727204454.07292: getting the remaining hosts for this loop 32134 1727204454.07294: done getting the remaining hosts for this loop 32134 1727204454.07299: getting the next task for host managed-node2 32134 1727204454.07305: done getting next task for host managed-node2 32134 1727204454.07309: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32134 1727204454.07312: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204454.07328: getting variables 32134 1727204454.07330: in VariableManager get_vars() 32134 1727204454.07367: Calling all_inventory to load vars for managed-node2 32134 1727204454.07370: Calling groups_inventory to load vars for managed-node2 32134 1727204454.07372: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204454.07383: Calling all_plugins_play to load vars for managed-node2 32134 1727204454.07386: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204454.07396: Calling groups_plugins_play to load vars for managed-node2 32134 1727204454.08758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204454.10407: done with get_vars() 32134 1727204454.10429: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.049) 0:00:28.508 ***** 32134 1727204454.10498: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 32134 1727204454.10739: worker is 1 (out of 1 available) 32134 1727204454.10756: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 32134 1727204454.10769: done queuing things up, now waiting for results queue to drain 32134 1727204454.10770: waiting for pending results... 
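Both tasks just skipped ("Enable network service" at main.yml:142 and "Ensure initscripts network file dependency is present" at main.yml:150) are gated on network_provider == "initscripts", and this run resolved network_provider to NetworkManager, so neither reaches the host. A hedged sketch of that gating pattern, matching the action plugins the log loaded (service and copy, with no_log on the first as reported above); the destination path and file content are assumptions:

    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
      when: network_provider == "initscripts"
      no_log: true

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network
        content: "# Created by the network system role\n"
        mode: "0644"
      when: network_provider == "initscripts"
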
32134 1727204454.10969: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32134 1727204454.11057: in run() - task 12b410aa-8751-753f-5162-000000000077 32134 1727204454.11070: variable 'ansible_search_path' from source: unknown 32134 1727204454.11073: variable 'ansible_search_path' from source: unknown 32134 1727204454.11235: calling self._execute() 32134 1727204454.11242: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.11246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.11249: variable 'omit' from source: magic vars 32134 1727204454.11895: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.11899: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.11903: variable 'connection_failed' from source: set_fact 32134 1727204454.11905: Evaluated conditional (not connection_failed): True 32134 1727204454.11999: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.12015: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.12142: variable 'connection_failed' from source: set_fact 32134 1727204454.12155: Evaluated conditional (not connection_failed): True 32134 1727204454.12171: variable 'omit' from source: magic vars 32134 1727204454.12227: variable 'omit' from source: magic vars 32134 1727204454.12441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204454.14225: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204454.14285: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204454.14321: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204454.14354: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204454.14377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204454.14443: variable 'network_provider' from source: set_fact 32134 1727204454.14552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204454.14579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204454.14602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204454.14657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204454.14894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204454.14898: variable 'omit' from source: magic vars 32134 
1727204454.14910: variable 'omit' from source: magic vars 32134 1727204454.15043: variable 'network_connections' from source: play vars 32134 1727204454.15060: variable 'profile' from source: play vars 32134 1727204454.15153: variable 'profile' from source: play vars 32134 1727204454.15164: variable 'interface' from source: set_fact 32134 1727204454.15244: variable 'interface' from source: set_fact 32134 1727204454.15408: variable 'omit' from source: magic vars 32134 1727204454.15427: variable '__lsr_ansible_managed' from source: task vars 32134 1727204454.15500: variable '__lsr_ansible_managed' from source: task vars 32134 1727204454.15846: Loaded config def from plugin (lookup/template) 32134 1727204454.15857: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32134 1727204454.15896: File lookup term: get_ansible_managed.j2 32134 1727204454.15904: variable 'ansible_search_path' from source: unknown 32134 1727204454.15917: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32134 1727204454.15937: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32134 1727204454.15959: variable 'ansible_search_path' from source: unknown 32134 1727204454.26231: variable 'ansible_managed' from source: unknown 32134 1727204454.26479: variable 'omit' from source: magic vars 32134 1727204454.26527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204454.26565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204454.26594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204454.26624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204454.26642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204454.26679: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204454.26688: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.26702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.26831: Set connection var ansible_timeout to 10 32134 1727204454.26855: Set connection var ansible_module_compression 
to ZIP_DEFLATED 32134 1727204454.26864: Set connection var ansible_connection to ssh 32134 1727204454.26873: Set connection var ansible_shell_type to sh 32134 1727204454.26885: Set connection var ansible_shell_executable to /bin/sh 32134 1727204454.26901: Set connection var ansible_pipelining to False 32134 1727204454.26937: variable 'ansible_shell_executable' from source: unknown 32134 1727204454.26946: variable 'ansible_connection' from source: unknown 32134 1727204454.26955: variable 'ansible_module_compression' from source: unknown 32134 1727204454.26962: variable 'ansible_shell_type' from source: unknown 32134 1727204454.26969: variable 'ansible_shell_executable' from source: unknown 32134 1727204454.26986: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.27095: variable 'ansible_pipelining' from source: unknown 32134 1727204454.27098: variable 'ansible_timeout' from source: unknown 32134 1727204454.27101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.27184: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204454.27206: variable 'omit' from source: magic vars 32134 1727204454.27223: starting attempt loop 32134 1727204454.27232: running the handler 32134 1727204454.27253: _low_level_execute_command(): starting 32134 1727204454.27266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204454.28016: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204454.28034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204454.28050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204454.28072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204454.28093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204454.28108: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204454.28207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.28229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204454.28247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204454.28275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204454.28351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204454.30184: stdout chunk (state=3): >>>/root <<< 32134 1727204454.30373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204454.30389: stdout chunk (state=3): >>><<< 32134 
1727204454.30409: stderr chunk (state=3): >>><<< 32134 1727204454.30437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204454.30456: _low_level_execute_command(): starting 32134 1727204454.30468: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901 `" && echo ansible-tmp-1727204454.3044384-33599-75365879354901="` echo /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901 `" ) && sleep 0' 32134 1727204454.31145: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204454.31161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204454.31174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204454.31198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204454.31261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.31327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204454.31345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204454.31374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204454.31443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204454.33560: stdout chunk (state=3): >>>ansible-tmp-1727204454.3044384-33599-75365879354901=/root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901 <<< 32134 1727204454.33676: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 32134 1727204454.33800: stderr chunk (state=3): >>><<< 32134 1727204454.33832: stdout chunk (state=3): >>><<< 32134 1727204454.33855: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204454.3044384-33599-75365879354901=/root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204454.33942: variable 'ansible_module_compression' from source: unknown 32134 1727204454.33995: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 32134 1727204454.34197: variable 'ansible_facts' from source: unknown 32134 1727204454.34200: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py 32134 1727204454.34385: Sending initial data 32134 1727204454.34425: Sent initial data (167 bytes) 32134 1727204454.35504: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204454.35536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204454.37265: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204454.37319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204454.37383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpdxap2a_r /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py <<< 32134 1727204454.37402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py" <<< 32134 1727204454.37442: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpdxap2a_r" to remote "/root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py" <<< 32134 1727204454.39072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204454.39125: stderr chunk (state=3): >>><<< 32134 1727204454.39136: stdout chunk (state=3): >>><<< 32134 1727204454.39169: done transferring module to remote 32134 1727204454.39188: _low_level_execute_command(): starting 32134 1727204454.39203: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/ /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py && sleep 0' 32134 1727204454.39944: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204454.39966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204454.39983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204454.40009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204454.40031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204454.40045: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204454.40074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.40125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.40326: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 32134 1727204454.40354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204454.40370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204454.40538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204454.42704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204454.42707: stdout chunk (state=3): >>><<< 32134 1727204454.42710: stderr chunk (state=3): >>><<< 32134 1727204454.42715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204454.42717: _low_level_execute_command(): starting 32134 1727204454.42720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/AnsiballZ_network_connections.py && sleep 0' 32134 1727204454.43336: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204454.43359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204454.43405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204454.43428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.43519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204454.43549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204454.43625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204454.74514: stdout chunk (state=3): >>> 
{"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32134 1727204454.76542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204454.76605: stderr chunk (state=3): >>><<< 32134 1727204454.76609: stdout chunk (state=3): >>><<< 32134 1727204454.76630: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
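The JSON block above is the module result coming back over the multiplexed SSH channel: fedora.linux_system_roles.network_connections ran with provider "nm" and a single profile, ethtest0, requested to be down, and returned changed: false. Expressed as play input, roughly the following would drive the invocation shown; this is a sketch of the calling side only, not the actual test playbook used in this run, which builds the connection list from the profile and interface variables seen earlier:

    - hosts: managed-node2
      vars:
        network_connections:
          - name: ethtest0
            state: down
      roles:
        - fedora.linux_system_roles.network
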
32134 1727204454.76663: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204454.76675: _low_level_execute_command(): starting 32134 1727204454.76681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204454.3044384-33599-75365879354901/ > /dev/null 2>&1 && sleep 0' 32134 1727204454.77160: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204454.77169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.77192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204454.77196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204454.77257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204454.77265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204454.77307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204454.79342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204454.79401: stderr chunk (state=3): >>><<< 32134 1727204454.79405: stdout chunk (state=3): >>><<< 32134 1727204454.79426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204454.79432: handler run complete 32134 1727204454.79458: attempt loop complete, returning result 32134 1727204454.79461: _execute() done 32134 1727204454.79465: dumping result to json 32134 1727204454.79471: done dumping result, returning 32134 1727204454.79483: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-753f-5162-000000000077] 32134 1727204454.79488: sending task result for task 12b410aa-8751-753f-5162-000000000077 32134 1727204454.79607: done sending task result for task 12b410aa-8751-753f-5162-000000000077 32134 1727204454.79611: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 32134 1727204454.79733: no more pending results, returning what we have 32134 1727204454.79737: results queue empty 32134 1727204454.79738: checking for any_errors_fatal 32134 1727204454.79747: done checking for any_errors_fatal 32134 1727204454.79748: checking for max_fail_percentage 32134 1727204454.79749: done checking for max_fail_percentage 32134 1727204454.79751: checking to see if all hosts have failed and the running result is not ok 32134 1727204454.79752: done checking to see if all hosts have failed 32134 1727204454.79753: getting the remaining hosts for this loop 32134 1727204454.79754: done getting the remaining hosts for this loop 32134 1727204454.79759: getting the next task for host managed-node2 32134 1727204454.79765: done getting next task for host managed-node2 32134 1727204454.79769: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32134 1727204454.79771: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204454.79781: getting variables 32134 1727204454.79783: in VariableManager get_vars() 32134 1727204454.79834: Calling all_inventory to load vars for managed-node2 32134 1727204454.79838: Calling groups_inventory to load vars for managed-node2 32134 1727204454.79841: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204454.79851: Calling all_plugins_play to load vars for managed-node2 32134 1727204454.79855: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204454.79858: Calling groups_plugins_play to load vars for managed-node2 32134 1727204454.81347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204454.84336: done with get_vars() 32134 1727204454.84384: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.739) 0:00:29.248 ***** 32134 1727204454.84491: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 32134 1727204454.84876: worker is 1 (out of 1 available) 32134 1727204454.84893: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 32134 1727204454.84907: done queuing things up, now waiting for results queue to drain 32134 1727204454.84909: waiting for pending results... 32134 1727204454.85314: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 32134 1727204454.85378: in run() - task 12b410aa-8751-753f-5162-000000000078 32134 1727204454.85412: variable 'ansible_search_path' from source: unknown 32134 1727204454.85424: variable 'ansible_search_path' from source: unknown 32134 1727204454.85473: calling self._execute() 32134 1727204454.85601: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.85619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.85644: variable 'omit' from source: magic vars 32134 1727204454.86173: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.86177: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.86301: variable 'connection_failed' from source: set_fact 32134 1727204454.86305: Evaluated conditional (not connection_failed): True 32134 1727204454.86403: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.86409: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.86496: variable 'connection_failed' from source: set_fact 32134 1727204454.86500: Evaluated conditional (not connection_failed): True 32134 1727204454.86597: variable 'network_state' from source: role '' defaults 32134 1727204454.86609: Evaluated conditional (network_state != {}): False 32134 1727204454.86615: when evaluation is False, skipping this task 32134 1727204454.86618: _execute() done 32134 1727204454.86621: dumping result to json 32134 1727204454.86623: done dumping result, returning 32134 1727204454.86633: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-753f-5162-000000000078] 32134 1727204454.86639: sending task result for task 12b410aa-8751-753f-5162-000000000078 32134 1727204454.86739: done sending task result for task 
12b410aa-8751-753f-5162-000000000078 32134 1727204454.86742: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204454.86805: no more pending results, returning what we have 32134 1727204454.86810: results queue empty 32134 1727204454.86813: checking for any_errors_fatal 32134 1727204454.86826: done checking for any_errors_fatal 32134 1727204454.86827: checking for max_fail_percentage 32134 1727204454.86829: done checking for max_fail_percentage 32134 1727204454.86830: checking to see if all hosts have failed and the running result is not ok 32134 1727204454.86831: done checking to see if all hosts have failed 32134 1727204454.86832: getting the remaining hosts for this loop 32134 1727204454.86834: done getting the remaining hosts for this loop 32134 1727204454.86838: getting the next task for host managed-node2 32134 1727204454.86845: done getting next task for host managed-node2 32134 1727204454.86849: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32134 1727204454.86859: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204454.86874: getting variables 32134 1727204454.86876: in VariableManager get_vars() 32134 1727204454.86919: Calling all_inventory to load vars for managed-node2 32134 1727204454.86922: Calling groups_inventory to load vars for managed-node2 32134 1727204454.86925: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204454.86935: Calling all_plugins_play to load vars for managed-node2 32134 1727204454.86938: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204454.86941: Calling groups_plugins_play to load vars for managed-node2 32134 1727204454.88219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204454.89855: done with get_vars() 32134 1727204454.89881: done getting variables 32134 1727204454.89939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.054) 0:00:29.303 ***** 32134 1727204454.89965: entering _queue_task() for managed-node2/debug 32134 1727204454.90238: worker is 1 (out of 1 available) 32134 1727204454.90253: exiting _queue_task() for managed-node2/debug 32134 1727204454.90265: done queuing things up, now waiting for results queue to drain 32134 1727204454.90267: waiting for pending results... 
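With network_state left at the role default of {}, the state-based provider path is skipped and the role moves on to its reporting tasks. The two debug tasks that follow only print fields of the __network_connections_result fact registered by the module run above; a sketch of that pair, with task names taken from the log and the bodies assumed:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
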
32134 1727204454.90464: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32134 1727204454.90554: in run() - task 12b410aa-8751-753f-5162-000000000079 32134 1727204454.90566: variable 'ansible_search_path' from source: unknown 32134 1727204454.90570: variable 'ansible_search_path' from source: unknown 32134 1727204454.90610: calling self._execute() 32134 1727204454.90688: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.90696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.90707: variable 'omit' from source: magic vars 32134 1727204454.91030: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.91046: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.91137: variable 'connection_failed' from source: set_fact 32134 1727204454.91142: Evaluated conditional (not connection_failed): True 32134 1727204454.91241: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.91245: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.91333: variable 'connection_failed' from source: set_fact 32134 1727204454.91337: Evaluated conditional (not connection_failed): True 32134 1727204454.91344: variable 'omit' from source: magic vars 32134 1727204454.91378: variable 'omit' from source: magic vars 32134 1727204454.91419: variable 'omit' from source: magic vars 32134 1727204454.91454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204454.91493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204454.91512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204454.91531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204454.91544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204454.91572: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204454.91576: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.91582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.91671: Set connection var ansible_timeout to 10 32134 1727204454.91683: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204454.91687: Set connection var ansible_connection to ssh 32134 1727204454.91692: Set connection var ansible_shell_type to sh 32134 1727204454.91701: Set connection var ansible_shell_executable to /bin/sh 32134 1727204454.91708: Set connection var ansible_pipelining to False 32134 1727204454.91731: variable 'ansible_shell_executable' from source: unknown 32134 1727204454.91734: variable 'ansible_connection' from source: unknown 32134 1727204454.91737: variable 'ansible_module_compression' from source: unknown 32134 1727204454.91740: variable 'ansible_shell_type' from source: unknown 32134 1727204454.91743: variable 'ansible_shell_executable' from source: unknown 32134 1727204454.91748: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.91754: variable 'ansible_pipelining' from source: unknown 32134 1727204454.91758: 
variable 'ansible_timeout' from source: unknown 32134 1727204454.91763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.91886: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204454.91900: variable 'omit' from source: magic vars 32134 1727204454.91905: starting attempt loop 32134 1727204454.91908: running the handler 32134 1727204454.92022: variable '__network_connections_result' from source: set_fact 32134 1727204454.92071: handler run complete 32134 1727204454.92087: attempt loop complete, returning result 32134 1727204454.92092: _execute() done 32134 1727204454.92095: dumping result to json 32134 1727204454.92100: done dumping result, returning 32134 1727204454.92109: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-753f-5162-000000000079] 32134 1727204454.92116: sending task result for task 12b410aa-8751-753f-5162-000000000079 32134 1727204454.92207: done sending task result for task 12b410aa-8751-753f-5162-000000000079 32134 1727204454.92210: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 32134 1727204454.92312: no more pending results, returning what we have 32134 1727204454.92316: results queue empty 32134 1727204454.92317: checking for any_errors_fatal 32134 1727204454.92324: done checking for any_errors_fatal 32134 1727204454.92325: checking for max_fail_percentage 32134 1727204454.92327: done checking for max_fail_percentage 32134 1727204454.92328: checking to see if all hosts have failed and the running result is not ok 32134 1727204454.92329: done checking to see if all hosts have failed 32134 1727204454.92330: getting the remaining hosts for this loop 32134 1727204454.92331: done getting the remaining hosts for this loop 32134 1727204454.92335: getting the next task for host managed-node2 32134 1727204454.92341: done getting next task for host managed-node2 32134 1727204454.92345: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32134 1727204454.92347: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204454.92358: getting variables 32134 1727204454.92360: in VariableManager get_vars() 32134 1727204454.92398: Calling all_inventory to load vars for managed-node2 32134 1727204454.92402: Calling groups_inventory to load vars for managed-node2 32134 1727204454.92404: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204454.92413: Calling all_plugins_play to load vars for managed-node2 32134 1727204454.92417: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204454.92420: Calling groups_plugins_play to load vars for managed-node2 32134 1727204454.93786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204454.95384: done with get_vars() 32134 1727204454.95413: done getting variables 32134 1727204454.95463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.055) 0:00:29.358 ***** 32134 1727204454.95488: entering _queue_task() for managed-node2/debug 32134 1727204454.95757: worker is 1 (out of 1 available) 32134 1727204454.95773: exiting _queue_task() for managed-node2/debug 32134 1727204454.95785: done queuing things up, now waiting for results queue to drain 32134 1727204454.95787: waiting for pending results... 32134 1727204454.95986: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32134 1727204454.96079: in run() - task 12b410aa-8751-753f-5162-00000000007a 32134 1727204454.96094: variable 'ansible_search_path' from source: unknown 32134 1727204454.96097: variable 'ansible_search_path' from source: unknown 32134 1727204454.96137: calling self._execute() 32134 1727204454.96224: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.96236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.96244: variable 'omit' from source: magic vars 32134 1727204454.96581: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.96593: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.96693: variable 'connection_failed' from source: set_fact 32134 1727204454.96698: Evaluated conditional (not connection_failed): True 32134 1727204454.96796: variable 'ansible_distribution_major_version' from source: facts 32134 1727204454.96800: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204454.96884: variable 'connection_failed' from source: set_fact 32134 1727204454.96889: Evaluated conditional (not connection_failed): True 32134 1727204454.96903: variable 'omit' from source: magic vars 32134 1727204454.96933: variable 'omit' from source: magic vars 32134 1727204454.96963: variable 'omit' from source: magic vars 32134 1727204454.97001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204454.97035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 32134 1727204454.97054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204454.97070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204454.97082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204454.97112: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204454.97116: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.97127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.97207: Set connection var ansible_timeout to 10 32134 1727204454.97222: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204454.97227: Set connection var ansible_connection to ssh 32134 1727204454.97230: Set connection var ansible_shell_type to sh 32134 1727204454.97241: Set connection var ansible_shell_executable to /bin/sh 32134 1727204454.97243: Set connection var ansible_pipelining to False 32134 1727204454.97262: variable 'ansible_shell_executable' from source: unknown 32134 1727204454.97265: variable 'ansible_connection' from source: unknown 32134 1727204454.97268: variable 'ansible_module_compression' from source: unknown 32134 1727204454.97272: variable 'ansible_shell_type' from source: unknown 32134 1727204454.97275: variable 'ansible_shell_executable' from source: unknown 32134 1727204454.97280: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204454.97285: variable 'ansible_pipelining' from source: unknown 32134 1727204454.97288: variable 'ansible_timeout' from source: unknown 32134 1727204454.97295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204454.97419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204454.97429: variable 'omit' from source: magic vars 32134 1727204454.97435: starting attempt loop 32134 1727204454.97439: running the handler 32134 1727204454.97485: variable '__network_connections_result' from source: set_fact 32134 1727204454.97550: variable '__network_connections_result' from source: set_fact 32134 1727204454.97654: handler run complete 32134 1727204454.97700: attempt loop complete, returning result 32134 1727204454.97704: _execute() done 32134 1727204454.97707: dumping result to json 32134 1727204454.97709: done dumping result, returning 32134 1727204454.97712: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-753f-5162-00000000007a] 32134 1727204454.97714: sending task result for task 12b410aa-8751-753f-5162-00000000007a 32134 1727204454.97814: done sending task result for task 12b410aa-8751-753f-5162-00000000007a 32134 1727204454.97817: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": 
"nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 32134 1727204454.97907: no more pending results, returning what we have 32134 1727204454.97910: results queue empty 32134 1727204454.97912: checking for any_errors_fatal 32134 1727204454.97918: done checking for any_errors_fatal 32134 1727204454.97919: checking for max_fail_percentage 32134 1727204454.97921: done checking for max_fail_percentage 32134 1727204454.97922: checking to see if all hosts have failed and the running result is not ok 32134 1727204454.97923: done checking to see if all hosts have failed 32134 1727204454.97924: getting the remaining hosts for this loop 32134 1727204454.97925: done getting the remaining hosts for this loop 32134 1727204454.97930: getting the next task for host managed-node2 32134 1727204454.97936: done getting next task for host managed-node2 32134 1727204454.97940: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32134 1727204454.97943: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204454.97953: getting variables 32134 1727204454.97955: in VariableManager get_vars() 32134 1727204454.97998: Calling all_inventory to load vars for managed-node2 32134 1727204454.98001: Calling groups_inventory to load vars for managed-node2 32134 1727204454.98004: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204454.98014: Calling all_plugins_play to load vars for managed-node2 32134 1727204454.98017: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204454.98021: Calling groups_plugins_play to load vars for managed-node2 32134 1727204454.99355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.01633: done with get_vars() 32134 1727204455.01661: done getting variables 32134 1727204455.01715: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.062) 0:00:29.421 ***** 32134 1727204455.01747: entering _queue_task() for managed-node2/debug 32134 1727204455.02010: worker is 1 (out of 1 available) 32134 1727204455.02025: exiting _queue_task() for managed-node2/debug 32134 1727204455.02038: done queuing things up, now waiting for results queue to drain 32134 1727204455.02040: waiting for pending results... 
32134 1727204455.02243: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32134 1727204455.02335: in run() - task 12b410aa-8751-753f-5162-00000000007b 32134 1727204455.02349: variable 'ansible_search_path' from source: unknown 32134 1727204455.02352: variable 'ansible_search_path' from source: unknown 32134 1727204455.02387: calling self._execute() 32134 1727204455.02481: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.02493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.02504: variable 'omit' from source: magic vars 32134 1727204455.02831: variable 'ansible_distribution_major_version' from source: facts 32134 1727204455.02842: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204455.02940: variable 'connection_failed' from source: set_fact 32134 1727204455.02943: Evaluated conditional (not connection_failed): True 32134 1727204455.03195: variable 'ansible_distribution_major_version' from source: facts 32134 1727204455.03199: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204455.03215: variable 'connection_failed' from source: set_fact 32134 1727204455.03227: Evaluated conditional (not connection_failed): True 32134 1727204455.03374: variable 'network_state' from source: role '' defaults 32134 1727204455.03394: Evaluated conditional (network_state != {}): False 32134 1727204455.03404: when evaluation is False, skipping this task 32134 1727204455.03412: _execute() done 32134 1727204455.03421: dumping result to json 32134 1727204455.03430: done dumping result, returning 32134 1727204455.03443: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-753f-5162-00000000007b] 32134 1727204455.03454: sending task result for task 12b410aa-8751-753f-5162-00000000007b skipping: [managed-node2] => { "false_condition": "network_state != {}" } 32134 1727204455.03645: no more pending results, returning what we have 32134 1727204455.03652: results queue empty 32134 1727204455.03653: checking for any_errors_fatal 32134 1727204455.03663: done checking for any_errors_fatal 32134 1727204455.03664: checking for max_fail_percentage 32134 1727204455.03666: done checking for max_fail_percentage 32134 1727204455.03667: checking to see if all hosts have failed and the running result is not ok 32134 1727204455.03668: done checking to see if all hosts have failed 32134 1727204455.03669: getting the remaining hosts for this loop 32134 1727204455.03670: done getting the remaining hosts for this loop 32134 1727204455.03674: getting the next task for host managed-node2 32134 1727204455.03681: done getting next task for host managed-node2 32134 1727204455.03685: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32134 1727204455.03688: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204455.03714: getting variables 32134 1727204455.03716: in VariableManager get_vars() 32134 1727204455.03756: Calling all_inventory to load vars for managed-node2 32134 1727204455.03759: Calling groups_inventory to load vars for managed-node2 32134 1727204455.03762: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.03777: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.03780: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.03785: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.04526: done sending task result for task 12b410aa-8751-753f-5162-00000000007b 32134 1727204455.04529: WORKER PROCESS EXITING 32134 1727204455.06367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.09424: done with get_vars() 32134 1727204455.09468: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.078) 0:00:29.499 ***** 32134 1727204455.09583: entering _queue_task() for managed-node2/ping 32134 1727204455.09966: worker is 1 (out of 1 available) 32134 1727204455.09980: exiting _queue_task() for managed-node2/ping 32134 1727204455.09993: done queuing things up, now waiting for results queue to drain 32134 1727204455.09995: waiting for pending results... 32134 1727204455.10414: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 32134 1727204455.10443: in run() - task 12b410aa-8751-753f-5162-00000000007c 32134 1727204455.10468: variable 'ansible_search_path' from source: unknown 32134 1727204455.10476: variable 'ansible_search_path' from source: unknown 32134 1727204455.10532: calling self._execute() 32134 1727204455.10661: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.10677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.10698: variable 'omit' from source: magic vars 32134 1727204455.11182: variable 'ansible_distribution_major_version' from source: facts 32134 1727204455.11203: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204455.11350: variable 'connection_failed' from source: set_fact 32134 1727204455.11364: Evaluated conditional (not connection_failed): True 32134 1727204455.11521: variable 'ansible_distribution_major_version' from source: facts 32134 1727204455.11534: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204455.11664: variable 'connection_failed' from source: set_fact 32134 1727204455.11676: Evaluated conditional (not connection_failed): True 32134 1727204455.11691: variable 'omit' from source: magic vars 32134 1727204455.11750: variable 'omit' from source: magic vars 32134 1727204455.11801: variable 'omit' from source: magic vars 32134 1727204455.11855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204455.11904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204455.11938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204455.11966: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204455.11987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204455.12033: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204455.12095: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.12098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.12184: Set connection var ansible_timeout to 10 32134 1727204455.12208: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204455.12217: Set connection var ansible_connection to ssh 32134 1727204455.12223: Set connection var ansible_shell_type to sh 32134 1727204455.12232: Set connection var ansible_shell_executable to /bin/sh 32134 1727204455.12240: Set connection var ansible_pipelining to False 32134 1727204455.12270: variable 'ansible_shell_executable' from source: unknown 32134 1727204455.12277: variable 'ansible_connection' from source: unknown 32134 1727204455.12283: variable 'ansible_module_compression' from source: unknown 32134 1727204455.12290: variable 'ansible_shell_type' from source: unknown 32134 1727204455.12298: variable 'ansible_shell_executable' from source: unknown 32134 1727204455.12358: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.12361: variable 'ansible_pipelining' from source: unknown 32134 1727204455.12363: variable 'ansible_timeout' from source: unknown 32134 1727204455.12365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.12578: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204455.12601: variable 'omit' from source: magic vars 32134 1727204455.12611: starting attempt loop 32134 1727204455.12619: running the handler 32134 1727204455.12637: _low_level_execute_command(): starting 32134 1727204455.12649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204455.13416: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.13434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204455.13568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204455.13572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 
1727204455.13597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.13677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.15467: stdout chunk (state=3): >>>/root <<< 32134 1727204455.15674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.15677: stdout chunk (state=3): >>><<< 32134 1727204455.15680: stderr chunk (state=3): >>><<< 32134 1727204455.15810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.15814: _low_level_execute_command(): starting 32134 1727204455.15817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998 `" && echo ansible-tmp-1727204455.1570783-33630-26028103877998="` echo /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998 `" ) && sleep 0' 32134 1727204455.16385: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.16448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.16531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204455.16562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.16582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.16650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 32134 1727204455.18935: stdout chunk (state=3): >>>ansible-tmp-1727204455.1570783-33630-26028103877998=/root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998 <<< 32134 1727204455.18950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.18960: stdout chunk (state=3): >>><<< 32134 1727204455.18977: stderr chunk (state=3): >>><<< 32134 1727204455.19195: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204455.1570783-33630-26028103877998=/root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.19200: variable 'ansible_module_compression' from source: unknown 32134 1727204455.19202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 32134 1727204455.19204: variable 'ansible_facts' from source: unknown 32134 1727204455.19245: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py 32134 1727204455.19502: Sending initial data 32134 1727204455.19514: Sent initial data (152 bytes) 32134 1727204455.20049: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.20065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204455.20080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204455.20197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.20223: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 32134 1727204455.20299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.22021: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32134 1727204455.22036: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 32134 1727204455.22055: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204455.22118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204455.22159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpvigazn9q /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py <<< 32134 1727204455.22196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpvigazn9q" to remote "/root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py" <<< 32134 1727204455.23220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.23352: stderr chunk (state=3): >>><<< 32134 1727204455.23367: stdout chunk (state=3): >>><<< 32134 1727204455.23417: done transferring module to remote 32134 1727204455.23437: _low_level_execute_command(): starting 32134 1727204455.23448: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/ /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py && sleep 0' 32134 1727204455.24169: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.24184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204455.24318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.24404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.24408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.26501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.26504: stdout chunk (state=3): >>><<< 32134 1727204455.26507: stderr chunk (state=3): >>><<< 32134 1727204455.26510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.26520: _low_level_execute_command(): starting 32134 1727204455.26523: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/AnsiballZ_ping.py && sleep 0' 32134 1727204455.27194: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.27210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204455.27232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204455.27359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.27382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.27473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.45006: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": 
"pong"}}} <<< 32134 1727204455.46700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204455.46704: stdout chunk (state=3): >>><<< 32134 1727204455.46707: stderr chunk (state=3): >>><<< 32134 1727204455.46710: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204455.46715: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204455.46718: _low_level_execute_command(): starting 32134 1727204455.46720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204455.1570783-33630-26028103877998/ > /dev/null 2>&1 && sleep 0' 32134 1727204455.47303: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.47324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204455.47340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204455.47361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204455.47380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204455.47397: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204455.47417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.47437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204455.47510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.47552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204455.47570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.47594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.47667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.49610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.49666: stderr chunk (state=3): >>><<< 32134 1727204455.49671: stdout chunk (state=3): >>><<< 32134 1727204455.49688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.49698: handler run complete 32134 1727204455.49717: attempt loop complete, returning result 32134 1727204455.49720: _execute() done 32134 1727204455.49722: dumping result to json 32134 1727204455.49728: done dumping result, returning 32134 1727204455.49737: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-753f-5162-00000000007c] 32134 1727204455.49742: sending task result for task 12b410aa-8751-753f-5162-00000000007c 32134 1727204455.49842: done sending task result for task 12b410aa-8751-753f-5162-00000000007c 32134 1727204455.49845: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 32134 1727204455.49917: no more pending results, returning what we have 32134 1727204455.49921: results queue empty 32134 1727204455.49922: checking for any_errors_fatal 32134 1727204455.49932: done checking for any_errors_fatal 32134 1727204455.49933: checking for max_fail_percentage 32134 1727204455.49934: done checking for max_fail_percentage 32134 1727204455.49936: checking to see if all hosts have failed and the running result is not ok 32134 1727204455.49937: done checking to see if all hosts have failed 32134 1727204455.49938: getting the remaining hosts for this loop 32134 
1727204455.49939: done getting the remaining hosts for this loop 32134 1727204455.49945: getting the next task for host managed-node2 32134 1727204455.49960: done getting next task for host managed-node2 32134 1727204455.49963: ^ task is: TASK: meta (role_complete) 32134 1727204455.49965: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204455.49976: getting variables 32134 1727204455.49977: in VariableManager get_vars() 32134 1727204455.50021: Calling all_inventory to load vars for managed-node2 32134 1727204455.50025: Calling groups_inventory to load vars for managed-node2 32134 1727204455.50027: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.50038: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.50041: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.50044: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.51479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.53083: done with get_vars() 32134 1727204455.53113: done getting variables 32134 1727204455.53180: done queuing things up, now waiting for results queue to drain 32134 1727204455.53182: results queue empty 32134 1727204455.53182: checking for any_errors_fatal 32134 1727204455.53185: done checking for any_errors_fatal 32134 1727204455.53185: checking for max_fail_percentage 32134 1727204455.53186: done checking for max_fail_percentage 32134 1727204455.53187: checking to see if all hosts have failed and the running result is not ok 32134 1727204455.53187: done checking to see if all hosts have failed 32134 1727204455.53188: getting the remaining hosts for this loop 32134 1727204455.53190: done getting the remaining hosts for this loop 32134 1727204455.53193: getting the next task for host managed-node2 32134 1727204455.53196: done getting next task for host managed-node2 32134 1727204455.53197: ^ task is: TASK: meta (flush_handlers) 32134 1727204455.53199: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204455.53201: getting variables 32134 1727204455.53203: in VariableManager get_vars() 32134 1727204455.53214: Calling all_inventory to load vars for managed-node2 32134 1727204455.53217: Calling groups_inventory to load vars for managed-node2 32134 1727204455.53218: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.53223: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.53225: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.53227: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.54414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.56005: done with get_vars() 32134 1727204455.56032: done getting variables 32134 1727204455.56075: in VariableManager get_vars() 32134 1727204455.56085: Calling all_inventory to load vars for managed-node2 32134 1727204455.56087: Calling groups_inventory to load vars for managed-node2 32134 1727204455.56091: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.56095: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.56097: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.56099: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.57181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.58776: done with get_vars() 32134 1727204455.58806: done queuing things up, now waiting for results queue to drain 32134 1727204455.58808: results queue empty 32134 1727204455.58809: checking for any_errors_fatal 32134 1727204455.58810: done checking for any_errors_fatal 32134 1727204455.58810: checking for max_fail_percentage 32134 1727204455.58812: done checking for max_fail_percentage 32134 1727204455.58812: checking to see if all hosts have failed and the running result is not ok 32134 1727204455.58813: done checking to see if all hosts have failed 32134 1727204455.58814: getting the remaining hosts for this loop 32134 1727204455.58815: done getting the remaining hosts for this loop 32134 1727204455.58817: getting the next task for host managed-node2 32134 1727204455.58820: done getting next task for host managed-node2 32134 1727204455.58821: ^ task is: TASK: meta (flush_handlers) 32134 1727204455.58823: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204455.58825: getting variables 32134 1727204455.58827: in VariableManager get_vars() 32134 1727204455.58837: Calling all_inventory to load vars for managed-node2 32134 1727204455.58840: Calling groups_inventory to load vars for managed-node2 32134 1727204455.58841: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.58846: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.58848: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.58850: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.59985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.61569: done with get_vars() 32134 1727204455.61597: done getting variables 32134 1727204455.61641: in VariableManager get_vars() 32134 1727204455.61654: Calling all_inventory to load vars for managed-node2 32134 1727204455.61656: Calling groups_inventory to load vars for managed-node2 32134 1727204455.61658: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.61662: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.61664: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.61667: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.62835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.64423: done with get_vars() 32134 1727204455.64452: done queuing things up, now waiting for results queue to drain 32134 1727204455.64454: results queue empty 32134 1727204455.64455: checking for any_errors_fatal 32134 1727204455.64456: done checking for any_errors_fatal 32134 1727204455.64457: checking for max_fail_percentage 32134 1727204455.64459: done checking for max_fail_percentage 32134 1727204455.64459: checking to see if all hosts have failed and the running result is not ok 32134 1727204455.64465: done checking to see if all hosts have failed 32134 1727204455.64465: getting the remaining hosts for this loop 32134 1727204455.64466: done getting the remaining hosts for this loop 32134 1727204455.64470: getting the next task for host managed-node2 32134 1727204455.64473: done getting next task for host managed-node2 32134 1727204455.64474: ^ task is: None 32134 1727204455.64475: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204455.64476: done queuing things up, now waiting for results queue to drain 32134 1727204455.64477: results queue empty 32134 1727204455.64477: checking for any_errors_fatal 32134 1727204455.64478: done checking for any_errors_fatal 32134 1727204455.64478: checking for max_fail_percentage 32134 1727204455.64479: done checking for max_fail_percentage 32134 1727204455.64480: checking to see if all hosts have failed and the running result is not ok 32134 1727204455.64480: done checking to see if all hosts have failed 32134 1727204455.64481: getting the next task for host managed-node2 32134 1727204455.64483: done getting next task for host managed-node2 32134 1727204455.64483: ^ task is: None 32134 1727204455.64484: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204455.64528: in VariableManager get_vars() 32134 1727204455.64546: done with get_vars() 32134 1727204455.64550: in VariableManager get_vars() 32134 1727204455.64560: done with get_vars() 32134 1727204455.64563: variable 'omit' from source: magic vars 32134 1727204455.64665: variable 'profile' from source: play vars 32134 1727204455.64751: in VariableManager get_vars() 32134 1727204455.64763: done with get_vars() 32134 1727204455.64780: variable 'omit' from source: magic vars 32134 1727204455.64844: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 32134 1727204455.65403: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 32134 1727204455.65427: getting the remaining hosts for this loop 32134 1727204455.65429: done getting the remaining hosts for this loop 32134 1727204455.65431: getting the next task for host managed-node2 32134 1727204455.65433: done getting next task for host managed-node2 32134 1727204455.65435: ^ task is: TASK: Gathering Facts 32134 1727204455.65437: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204455.65439: getting variables 32134 1727204455.65440: in VariableManager get_vars() 32134 1727204455.65450: Calling all_inventory to load vars for managed-node2 32134 1727204455.65452: Calling groups_inventory to load vars for managed-node2 32134 1727204455.65454: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204455.65459: Calling all_plugins_play to load vars for managed-node2 32134 1727204455.65460: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204455.65463: Calling groups_plugins_play to load vars for managed-node2 32134 1727204455.66917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204455.68796: done with get_vars() 32134 1727204455.68824: done getting variables 32134 1727204455.68864: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.593) 0:00:30.092 ***** 32134 1727204455.68888: entering _queue_task() for managed-node2/gather_facts 32134 1727204455.69160: worker is 1 (out of 1 available) 32134 1727204455.69172: exiting _queue_task() for managed-node2/gather_facts 32134 1727204455.69184: done queuing things up, now waiting for results queue to drain 32134 1727204455.69186: waiting for pending results... 
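At this point the run has moved on to the next play, printed above as PLAY [Remove {{ profile }}], and is queuing its implicit Gathering Facts task (task path remove_profile.yml:3). As a rough sketch of how such a play is typically laid out, with details the log does not show filled in as labeled assumptions rather than taken from the playbook:

    # Sketch only: play layout inferred from the log, not the actual remove_profile.yml.
    - name: "Remove {{ profile }}"
      hosts: managed-node2          # assumed target; the log only shows this host being acted on
      gather_facts: true            # produces the implicit TASK [Gathering Facts] queued above
      vars:
        profile: ethtest0           # assumed value; 'profile' is reported as a play var above
      tasks: []                     # removal tasks omitted; they are not visible in this excerpt

Because fact gathering is enabled, Ansible runs the gather_facts action over the existing SSH connection before any play tasks, which is the module transfer and execution that follows in the log.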
32134 1727204455.69379: running TaskExecutor() for managed-node2/TASK: Gathering Facts 32134 1727204455.69696: in run() - task 12b410aa-8751-753f-5162-000000000521 32134 1727204455.69700: variable 'ansible_search_path' from source: unknown 32134 1727204455.69703: calling self._execute() 32134 1727204455.69706: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.69709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.69713: variable 'omit' from source: magic vars 32134 1727204455.70155: variable 'ansible_distribution_major_version' from source: facts 32134 1727204455.70172: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204455.70182: variable 'omit' from source: magic vars 32134 1727204455.70230: variable 'omit' from source: magic vars 32134 1727204455.70280: variable 'omit' from source: magic vars 32134 1727204455.70336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204455.70380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204455.70413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204455.70441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204455.70460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204455.70502: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204455.70514: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.70523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.70648: Set connection var ansible_timeout to 10 32134 1727204455.70674: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204455.70681: Set connection var ansible_connection to ssh 32134 1727204455.70688: Set connection var ansible_shell_type to sh 32134 1727204455.70702: Set connection var ansible_shell_executable to /bin/sh 32134 1727204455.70716: Set connection var ansible_pipelining to False 32134 1727204455.70747: variable 'ansible_shell_executable' from source: unknown 32134 1727204455.70756: variable 'ansible_connection' from source: unknown 32134 1727204455.70763: variable 'ansible_module_compression' from source: unknown 32134 1727204455.70771: variable 'ansible_shell_type' from source: unknown 32134 1727204455.70777: variable 'ansible_shell_executable' from source: unknown 32134 1727204455.70786: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204455.70796: variable 'ansible_pipelining' from source: unknown 32134 1727204455.70803: variable 'ansible_timeout' from source: unknown 32134 1727204455.70814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204455.71025: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204455.71044: variable 'omit' from source: magic vars 32134 1727204455.71054: starting attempt loop 32134 1727204455.71061: running the 
handler 32134 1727204455.71084: variable 'ansible_facts' from source: unknown 32134 1727204455.71113: _low_level_execute_command(): starting 32134 1727204455.71129: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204455.71879: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.71901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204455.71920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204455.71939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204455.71960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204455.71975: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204455.72084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.72115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.72193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.73980: stdout chunk (state=3): >>>/root <<< 32134 1727204455.74164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.74191: stdout chunk (state=3): >>><<< 32134 1727204455.74206: stderr chunk (state=3): >>><<< 32134 1727204455.74237: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.74261: _low_level_execute_command(): starting 32134 1727204455.74274: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160 `" && echo ansible-tmp-1727204455.7424555-33651-128695671363160="` echo /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160 `" ) && sleep 0' 32134 1727204455.75013: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.75050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204455.75070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.75091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.75173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.77225: stdout chunk (state=3): >>>ansible-tmp-1727204455.7424555-33651-128695671363160=/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160 <<< 32134 1727204455.77495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.77499: stdout chunk (state=3): >>><<< 32134 1727204455.77501: stderr chunk (state=3): >>><<< 32134 1727204455.77504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204455.7424555-33651-128695671363160=/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.77507: variable 'ansible_module_compression' from source: unknown 32134 1727204455.77568: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32134 1727204455.77647: variable 'ansible_facts' from source: unknown 32134 1727204455.77865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py 32134 1727204455.78080: Sending initial data 32134 1727204455.78092: Sent initial data (154 bytes) 32134 1727204455.78806: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.78822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204455.78841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.78864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.78940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.80656: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204455.80715: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204455.80755: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp9ejl_798 /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py <<< 32134 1727204455.80779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py" <<< 32134 1727204455.80798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp9ejl_798" to remote "/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py" <<< 32134 1727204455.83265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.83295: stdout chunk (state=3): >>><<< 32134 1727204455.83299: stderr chunk (state=3): >>><<< 32134 1727204455.83330: done transferring module to remote 32134 1727204455.83448: _low_level_execute_command(): starting 32134 1727204455.83451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/ /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py && sleep 0' 32134 1727204455.84115: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.84217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204455.84259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.84359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204455.89750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204455.89779: stderr chunk (state=3): >>><<< 32134 1727204455.89783: stdout chunk (state=3): >>><<< 32134 1727204455.89902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204455.89919: _low_level_execute_command(): starting 32134 1727204455.89922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/AnsiballZ_setup.py && sleep 0' 32134 1727204455.90323: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204455.90339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204455.90391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204455.90395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204455.90463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204456.63558: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": 
"cpython"}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_R<<< 32134 1727204456.63602: stdout chunk (state=3): >>>ANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.67333984375, "5m": 0.6875, "15m": 0.46923828125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "56", "epoch": "1727204456", "epoch_int": 
"1727204456", "date": "2024-09-24", "time": "15:00:56", "iso8601_micro": "2024-09-24T19:00:56.225166Z", "iso8601": "2024-09-24T19:00:56Z", "iso8601_basic": "20240924T150056225166", "iso8601_basic_short": "20240924T150056", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2826, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 891, "free": 2826}, "nocache": {"free": 3464, "used": 253}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "<<< 32134 1727204456.63657: stdout chunk (state=3): >>>holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 960, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144642560, "block_size": 4096, "block_total": 64479564, "block_available": 61314610, "block_used": 3164954, "inode_total": 16384000, "inode_available": 16302235, "inode_used": 81765, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["ethtest0", "eth0", "peerethtest0", "lo"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "56:10:7a:3f:31:3a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::928c:aece:b50f:aeb4", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on 
[fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off 
[fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp<<< 32134 1727204456.63673: stdout chunk (state=3): >>>_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ea:ec:31:9c:5f:fe", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ec:31ff:fe9c:5ffe", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::928c:aece:b50f:aeb4", "fe80::4a44:1e77:128f:34e8", "fe80::e8ec:31ff:fe9c:5ffe"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8", "fe80::928c:aece:b50f:aeb4", "fe80::e8ec:31ff:fe9c:5ffe"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32134 1727204456.65877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204456.65881: stdout chunk (state=3): >>><<< 32134 1727204456.66096: stderr chunk (state=3): >>><<< 32134 1727204456.66104: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.67333984375, "5m": 0.6875, "15m": 0.46923828125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "56", "epoch": "1727204456", "epoch_int": "1727204456", "date": "2024-09-24", "time": "15:00:56", "iso8601_micro": "2024-09-24T19:00:56.225166Z", "iso8601": "2024-09-24T19:00:56Z", "iso8601_basic": "20240924T150056225166", "iso8601_basic_short": "20240924T150056", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", 
"net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2826, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 891, "free": 2826}, "nocache": {"free": 3464, "used": 253}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 960, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144642560, "block_size": 4096, "block_total": 64479564, "block_available": 61314610, "block_used": 3164954, "inode_total": 16384000, "inode_available": 16302235, "inode_used": 81765, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["ethtest0", "eth0", "peerethtest0", "lo"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "56:10:7a:3f:31:3a", "mtu": 
1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::928c:aece:b50f:aeb4", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ea:ec:31:9c:5f:fe", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ec:31ff:fe9c:5ffe", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::928c:aece:b50f:aeb4", "fe80::4a44:1e77:128f:34e8", "fe80::e8ec:31ff:fe9c:5ffe"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8", "fe80::928c:aece:b50f:aeb4", 
"fe80::e8ec:31ff:fe9c:5ffe"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204456.66812: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204456.66845: _low_level_execute_command(): starting 32134 1727204456.66858: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204455.7424555-33651-128695671363160/ > /dev/null 2>&1 && sleep 0' 32134 1727204456.67515: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204456.67538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204456.67573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204456.67625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204456.67629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204456.67631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204456.67693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204456.67697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204456.67740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204456.69737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204456.69741: stdout chunk (state=3): >>><<< 32134 1727204456.69744: stderr chunk (state=3): >>><<< 32134 1727204456.69894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204456.69899: handler run complete 32134 1727204456.70054: variable 'ansible_facts' from source: unknown 32134 1727204456.70258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.70673: variable 'ansible_facts' from source: unknown 32134 1727204456.70763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.70913: attempt loop complete, returning result 32134 1727204456.70921: _execute() done 32134 1727204456.70924: dumping result to json 32134 1727204456.70955: done dumping result, returning 32134 1727204456.70963: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-753f-5162-000000000521] 32134 1727204456.70968: sending task result for task 12b410aa-8751-753f-5162-000000000521 ok: [managed-node2] 32134 1727204456.71826: no more pending results, returning what we have 32134 1727204456.71829: results queue empty 32134 1727204456.71830: checking for any_errors_fatal 32134 1727204456.71831: done checking for any_errors_fatal 32134 1727204456.71832: checking for max_fail_percentage 32134 1727204456.71834: done checking for max_fail_percentage 32134 1727204456.71835: checking to see if all hosts have failed and the running result is not ok 32134 1727204456.71836: done checking to see if all hosts have failed 32134 1727204456.71836: getting the remaining hosts for this loop 32134 1727204456.71837: done getting the remaining hosts for this loop 32134 1727204456.71840: getting the next task for host managed-node2 32134 1727204456.71845: done getting next 
task for host managed-node2 32134 1727204456.71846: ^ task is: TASK: meta (flush_handlers) 32134 1727204456.71848: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204456.71851: getting variables 32134 1727204456.71852: in VariableManager get_vars() 32134 1727204456.71878: Calling all_inventory to load vars for managed-node2 32134 1727204456.71880: Calling groups_inventory to load vars for managed-node2 32134 1727204456.71882: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204456.71894: Calling all_plugins_play to load vars for managed-node2 32134 1727204456.71896: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204456.71900: Calling groups_plugins_play to load vars for managed-node2 32134 1727204456.72418: done sending task result for task 12b410aa-8751-753f-5162-000000000521 32134 1727204456.72422: WORKER PROCESS EXITING 32134 1727204456.73113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.74727: done with get_vars() 32134 1727204456.74753: done getting variables 32134 1727204456.74820: in VariableManager get_vars() 32134 1727204456.74831: Calling all_inventory to load vars for managed-node2 32134 1727204456.74833: Calling groups_inventory to load vars for managed-node2 32134 1727204456.74835: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204456.74839: Calling all_plugins_play to load vars for managed-node2 32134 1727204456.74841: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204456.74843: Calling groups_plugins_play to load vars for managed-node2 32134 1727204456.79440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.81053: done with get_vars() 32134 1727204456.81081: done queuing things up, now waiting for results queue to drain 32134 1727204456.81083: results queue empty 32134 1727204456.81084: checking for any_errors_fatal 32134 1727204456.81087: done checking for any_errors_fatal 32134 1727204456.81088: checking for max_fail_percentage 32134 1727204456.81090: done checking for max_fail_percentage 32134 1727204456.83416: checking to see if all hosts have failed and the running result is not ok 32134 1727204456.83417: done checking to see if all hosts have failed 32134 1727204456.83419: getting the remaining hosts for this loop 32134 1727204456.83420: done getting the remaining hosts for this loop 32134 1727204456.83424: getting the next task for host managed-node2 32134 1727204456.83427: done getting next task for host managed-node2 32134 1727204456.83430: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32134 1727204456.83432: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204456.83441: getting variables 32134 1727204456.83442: in VariableManager get_vars() 32134 1727204456.83457: Calling all_inventory to load vars for managed-node2 32134 1727204456.83459: Calling groups_inventory to load vars for managed-node2 32134 1727204456.83460: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204456.83465: Calling all_plugins_play to load vars for managed-node2 32134 1727204456.83468: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204456.83471: Calling groups_plugins_play to load vars for managed-node2 32134 1727204456.84647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.86239: done with get_vars() 32134 1727204456.86260: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:56 -0400 (0:00:01.174) 0:00:31.267 ***** 32134 1727204456.86330: entering _queue_task() for managed-node2/include_tasks 32134 1727204456.86670: worker is 1 (out of 1 available) 32134 1727204456.86685: exiting _queue_task() for managed-node2/include_tasks 32134 1727204456.86700: done queuing things up, now waiting for results queue to drain 32134 1727204456.86703: waiting for pending results... 32134 1727204456.86912: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32134 1727204456.86998: in run() - task 12b410aa-8751-753f-5162-000000000084 32134 1727204456.87011: variable 'ansible_search_path' from source: unknown 32134 1727204456.87016: variable 'ansible_search_path' from source: unknown 32134 1727204456.87056: calling self._execute() 32134 1727204456.87140: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204456.87146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204456.87165: variable 'omit' from source: magic vars 32134 1727204456.87495: variable 'ansible_distribution_major_version' from source: facts 32134 1727204456.87508: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204456.87517: _execute() done 32134 1727204456.87521: dumping result to json 32134 1727204456.87526: done dumping result, returning 32134 1727204456.87533: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-753f-5162-000000000084] 32134 1727204456.87539: sending task result for task 12b410aa-8751-753f-5162-000000000084 32134 1727204456.87688: no more pending results, returning what we have 32134 1727204456.87696: in VariableManager get_vars() 32134 1727204456.87745: Calling all_inventory to load vars for managed-node2 32134 1727204456.87749: Calling groups_inventory to load vars for managed-node2 32134 1727204456.87751: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204456.87766: Calling all_plugins_play to load vars for managed-node2 32134 1727204456.87769: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204456.87772: Calling groups_plugins_play to load vars for managed-node2 32134 1727204456.88408: done sending task result for task 12b410aa-8751-753f-5162-000000000084 32134 1727204456.88413: WORKER PROCESS EXITING 32134 1727204456.89064: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.91259: done with get_vars() 32134 1727204456.91295: variable 'ansible_search_path' from source: unknown 32134 1727204456.91297: variable 'ansible_search_path' from source: unknown 32134 1727204456.91332: we have included files to process 32134 1727204456.91333: generating all_blocks data 32134 1727204456.91335: done generating all_blocks data 32134 1727204456.91336: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204456.91338: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204456.91340: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32134 1727204456.92070: done processing included file 32134 1727204456.92074: iterating over new_blocks loaded from include file 32134 1727204456.92076: in VariableManager get_vars() 32134 1727204456.92105: done with get_vars() 32134 1727204456.92107: filtering new block on tags 32134 1727204456.92127: done filtering new block on tags 32134 1727204456.92131: in VariableManager get_vars() 32134 1727204456.92157: done with get_vars() 32134 1727204456.92160: filtering new block on tags 32134 1727204456.92185: done filtering new block on tags 32134 1727204456.92195: in VariableManager get_vars() 32134 1727204456.92222: done with get_vars() 32134 1727204456.92225: filtering new block on tags 32134 1727204456.92247: done filtering new block on tags 32134 1727204456.92250: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 32134 1727204456.92256: extending task lists for all hosts with included blocks 32134 1727204456.92801: done extending task lists 32134 1727204456.92803: done processing included files 32134 1727204456.92804: results queue empty 32134 1727204456.92805: checking for any_errors_fatal 32134 1727204456.92807: done checking for any_errors_fatal 32134 1727204456.92808: checking for max_fail_percentage 32134 1727204456.92809: done checking for max_fail_percentage 32134 1727204456.92811: checking to see if all hosts have failed and the running result is not ok 32134 1727204456.92812: done checking to see if all hosts have failed 32134 1727204456.92812: getting the remaining hosts for this loop 32134 1727204456.92814: done getting the remaining hosts for this loop 32134 1727204456.92818: getting the next task for host managed-node2 32134 1727204456.92822: done getting next task for host managed-node2 32134 1727204456.92825: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32134 1727204456.92828: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204456.92839: getting variables 32134 1727204456.92840: in VariableManager get_vars() 32134 1727204456.92857: Calling all_inventory to load vars for managed-node2 32134 1727204456.92861: Calling groups_inventory to load vars for managed-node2 32134 1727204456.92864: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204456.92870: Calling all_plugins_play to load vars for managed-node2 32134 1727204456.92876: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204456.92880: Calling groups_plugins_play to load vars for managed-node2 32134 1727204456.94972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204456.96581: done with get_vars() 32134 1727204456.96607: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:00:56 -0400 (0:00:00.103) 0:00:31.370 ***** 32134 1727204456.96675: entering _queue_task() for managed-node2/setup 32134 1727204456.96967: worker is 1 (out of 1 available) 32134 1727204456.96983: exiting _queue_task() for managed-node2/setup 32134 1727204456.96998: done queuing things up, now waiting for results queue to drain 32134 1727204456.97000: waiting for pending results... 32134 1727204456.97200: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32134 1727204456.97332: in run() - task 12b410aa-8751-753f-5162-000000000562 32134 1727204456.97349: variable 'ansible_search_path' from source: unknown 32134 1727204456.97353: variable 'ansible_search_path' from source: unknown 32134 1727204456.97383: calling self._execute() 32134 1727204456.97472: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204456.97480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204456.97492: variable 'omit' from source: magic vars 32134 1727204456.97821: variable 'ansible_distribution_major_version' from source: facts 32134 1727204456.97832: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204456.98024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204456.99737: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204456.99799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204456.99832: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204456.99867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204456.99888: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204456.99959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204456.99987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 32134 1727204457.00010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204457.00044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204457.00056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204457.00109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204457.00130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204457.00151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204457.00182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204457.00202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204457.00343: variable '__network_required_facts' from source: role '' defaults 32134 1727204457.00352: variable 'ansible_facts' from source: unknown 32134 1727204457.01167: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32134 1727204457.01173: when evaluation is False, skipping this task 32134 1727204457.01176: _execute() done 32134 1727204457.01179: dumping result to json 32134 1727204457.01182: done dumping result, returning 32134 1727204457.01194: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-753f-5162-000000000562] 32134 1727204457.01197: sending task result for task 12b410aa-8751-753f-5162-000000000562 32134 1727204457.01298: done sending task result for task 12b410aa-8751-753f-5162-000000000562 32134 1727204457.01301: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204457.01348: no more pending results, returning what we have 32134 1727204457.01352: results queue empty 32134 1727204457.01353: checking for any_errors_fatal 32134 1727204457.01355: done checking for any_errors_fatal 32134 1727204457.01355: checking for max_fail_percentage 32134 1727204457.01357: done checking for max_fail_percentage 32134 1727204457.01358: checking to see if all hosts have failed and the running result is not ok 32134 1727204457.01359: done checking to see if all hosts have failed 32134 1727204457.01360: getting the remaining hosts for this loop 32134 1727204457.01362: done getting the remaining hosts for 
this loop 32134 1727204457.01366: getting the next task for host managed-node2 32134 1727204457.01375: done getting next task for host managed-node2 32134 1727204457.01380: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32134 1727204457.01383: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204457.01401: getting variables 32134 1727204457.01403: in VariableManager get_vars() 32134 1727204457.01444: Calling all_inventory to load vars for managed-node2 32134 1727204457.01448: Calling groups_inventory to load vars for managed-node2 32134 1727204457.01451: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204457.01462: Calling all_plugins_play to load vars for managed-node2 32134 1727204457.01466: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204457.01469: Calling groups_plugins_play to load vars for managed-node2 32134 1727204457.02876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204457.04492: done with get_vars() 32134 1727204457.04516: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:00:57 -0400 (0:00:00.079) 0:00:31.449 ***** 32134 1727204457.04599: entering _queue_task() for managed-node2/stat 32134 1727204457.04864: worker is 1 (out of 1 available) 32134 1727204457.04883: exiting _queue_task() for managed-node2/stat 32134 1727204457.04898: done queuing things up, now waiting for results queue to drain 32134 1727204457.04900: waiting for pending results... 
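
[editor's note] The "Ensure ansible_facts used by role are present" task above is skipped because the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluates to False: every fact the role needs is already present from the earlier Gathering Facts run. A minimal Python sketch of that set-difference check follows; the variable names and the example fact list are illustrative only, the real list lives in the role's defaults.

    # Sketch (not the role's actual code) of the conditional logged above:
    #   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    required_facts = ["distribution", "distribution_major_version", "os_family"]  # illustrative

    def facts_missing(ansible_facts: dict, required=required_facts) -> bool:
        """True when at least one required top-level fact is absent,
        i.e. when the setup module would have to be re-run."""
        missing = set(required) - set(ansible_facts.keys())
        return len(missing) > 0

    # With the full fact set already gathered (as in this run) the check is False
    # and the task is skipped:
    print(facts_missing({"distribution": "Fedora",
                         "distribution_major_version": "40",
                         "os_family": "RedHat"}))   # -> False
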
32134 1727204457.05099: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 32134 1727204457.05208: in run() - task 12b410aa-8751-753f-5162-000000000564 32134 1727204457.05223: variable 'ansible_search_path' from source: unknown 32134 1727204457.05227: variable 'ansible_search_path' from source: unknown 32134 1727204457.05264: calling self._execute() 32134 1727204457.05356: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204457.05361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204457.05371: variable 'omit' from source: magic vars 32134 1727204457.05696: variable 'ansible_distribution_major_version' from source: facts 32134 1727204457.05706: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204457.05851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204457.06075: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204457.06123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204457.06150: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204457.06179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204457.06282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204457.06307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204457.06339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204457.06358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204457.06435: variable '__network_is_ostree' from source: set_fact 32134 1727204457.06441: Evaluated conditional (not __network_is_ostree is defined): False 32134 1727204457.06444: when evaluation is False, skipping this task 32134 1727204457.06447: _execute() done 32134 1727204457.06456: dumping result to json 32134 1727204457.06459: done dumping result, returning 32134 1727204457.06462: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-753f-5162-000000000564] 32134 1727204457.06469: sending task result for task 12b410aa-8751-753f-5162-000000000564 32134 1727204457.06564: done sending task result for task 12b410aa-8751-753f-5162-000000000564 32134 1727204457.06567: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32134 1727204457.06623: no more pending results, returning what we have 32134 1727204457.06627: results queue empty 32134 1727204457.06628: checking for any_errors_fatal 32134 1727204457.06635: done checking for any_errors_fatal 32134 1727204457.06636: checking for 
max_fail_percentage 32134 1727204457.06638: done checking for max_fail_percentage 32134 1727204457.06639: checking to see if all hosts have failed and the running result is not ok 32134 1727204457.06640: done checking to see if all hosts have failed 32134 1727204457.06641: getting the remaining hosts for this loop 32134 1727204457.06642: done getting the remaining hosts for this loop 32134 1727204457.06646: getting the next task for host managed-node2 32134 1727204457.06654: done getting next task for host managed-node2 32134 1727204457.06658: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32134 1727204457.06661: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204457.06676: getting variables 32134 1727204457.06677: in VariableManager get_vars() 32134 1727204457.06724: Calling all_inventory to load vars for managed-node2 32134 1727204457.06728: Calling groups_inventory to load vars for managed-node2 32134 1727204457.06730: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204457.06740: Calling all_plugins_play to load vars for managed-node2 32134 1727204457.06744: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204457.06747: Calling groups_plugins_play to load vars for managed-node2 32134 1727204457.07976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204457.09585: done with get_vars() 32134 1727204457.09610: done getting variables 32134 1727204457.09663: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:57 -0400 (0:00:00.050) 0:00:31.500 ***** 32134 1727204457.09695: entering _queue_task() for managed-node2/set_fact 32134 1727204457.09961: worker is 1 (out of 1 available) 32134 1727204457.09976: exiting _queue_task() for managed-node2/set_fact 32134 1727204457.09988: done queuing things up, now waiting for results queue to drain 32134 1727204457.09992: waiting for pending results... 
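
[editor's note] Both ostree-related tasks above skip on "not __network_is_ostree is defined": the fact was set earlier in the play, so neither the stat nor the set_fact re-runs. The sketch below shows that compute-once-then-skip pattern under stated assumptions; the marker path is an assumption for illustration (this log never reaches the stat, so the real path is not shown here).

    # Illustrative sketch of the skip pattern seen above. Hypothetical helper,
    # not role code; the marker path is an assumed example.
    import os

    facts = {"__network_is_ostree": False}   # already set earlier in the play

    def ensure_is_ostree_fact(facts: dict, marker: str = "/run/ostree-booted") -> dict:
        if "__network_is_ostree" in facts:       # 'not __network_is_ostree is defined' -> False
            return facts                         # task skipped, as in the log
        facts["__network_is_ostree"] = os.path.exists(marker)
        return facts

    ensure_is_ostree_fact(facts)                 # no-op on this run
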
32134 1727204457.10193: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32134 1727204457.10306: in run() - task 12b410aa-8751-753f-5162-000000000565 32134 1727204457.10323: variable 'ansible_search_path' from source: unknown 32134 1727204457.10328: variable 'ansible_search_path' from source: unknown 32134 1727204457.10361: calling self._execute() 32134 1727204457.10447: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204457.10459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204457.10472: variable 'omit' from source: magic vars 32134 1727204457.10794: variable 'ansible_distribution_major_version' from source: facts 32134 1727204457.10806: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204457.10952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204457.11172: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204457.11216: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204457.11246: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204457.11277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204457.11648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204457.11672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204457.11696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204457.11719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204457.11795: variable '__network_is_ostree' from source: set_fact 32134 1727204457.11803: Evaluated conditional (not __network_is_ostree is defined): False 32134 1727204457.11806: when evaluation is False, skipping this task 32134 1727204457.11810: _execute() done 32134 1727204457.11815: dumping result to json 32134 1727204457.11818: done dumping result, returning 32134 1727204457.11826: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-753f-5162-000000000565] 32134 1727204457.11831: sending task result for task 12b410aa-8751-753f-5162-000000000565 32134 1727204457.11922: done sending task result for task 12b410aa-8751-753f-5162-000000000565 32134 1727204457.11925: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32134 1727204457.11977: no more pending results, returning what we have 32134 1727204457.11981: results queue empty 32134 1727204457.11982: checking for any_errors_fatal 32134 1727204457.11990: done checking for any_errors_fatal 32134 
1727204457.11991: checking for max_fail_percentage 32134 1727204457.11993: done checking for max_fail_percentage 32134 1727204457.11994: checking to see if all hosts have failed and the running result is not ok 32134 1727204457.11996: done checking to see if all hosts have failed 32134 1727204457.11997: getting the remaining hosts for this loop 32134 1727204457.11998: done getting the remaining hosts for this loop 32134 1727204457.12003: getting the next task for host managed-node2 32134 1727204457.12015: done getting next task for host managed-node2 32134 1727204457.12020: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 32134 1727204457.12023: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204457.12039: getting variables 32134 1727204457.12041: in VariableManager get_vars() 32134 1727204457.12078: Calling all_inventory to load vars for managed-node2 32134 1727204457.12081: Calling groups_inventory to load vars for managed-node2 32134 1727204457.12084: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204457.12101: Calling all_plugins_play to load vars for managed-node2 32134 1727204457.12105: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204457.12109: Calling groups_plugins_play to load vars for managed-node2 32134 1727204457.13448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204457.15072: done with get_vars() 32134 1727204457.15099: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:57 -0400 (0:00:00.054) 0:00:31.555 ***** 32134 1727204457.15185: entering _queue_task() for managed-node2/service_facts 32134 1727204457.15464: worker is 1 (out of 1 available) 32134 1727204457.15479: exiting _queue_task() for managed-node2/service_facts 32134 1727204457.15494: done queuing things up, now waiting for results queue to drain 32134 1727204457.15496: waiting for pending results... 
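
[editor's note] The next task queues the service_facts module; the large JSON blob later in this section is its result, a dict keyed by unit name where each entry carries name/state/status/source. A small hedged sketch of consuming that structure follows; the helper and the trimmed sample dict are hypothetical, but the entry shape matches the output logged below.

    # Sketch of consuming the 'services' fact returned by service_facts later in
    # this section. Hypothetical helper, not role code; sample data is trimmed
    # from the logged output.
    services = {
        "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                                   "status": "enabled", "source": "systemd"},
        "network.service": {"name": "network.service", "state": "stopped",
                            "status": "not-found", "source": "systemd"},
    }

    def running(services: dict) -> list[str]:
        """Names of units that service_facts reports as running."""
        return [name for name, svc in services.items() if svc["state"] == "running"]

    print(running(services))   # -> ['NetworkManager.service']
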
32134 1727204457.15698: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 32134 1727204457.15795: in run() - task 12b410aa-8751-753f-5162-000000000567 32134 1727204457.15810: variable 'ansible_search_path' from source: unknown 32134 1727204457.15816: variable 'ansible_search_path' from source: unknown 32134 1727204457.15849: calling self._execute() 32134 1727204457.15936: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204457.15941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204457.15955: variable 'omit' from source: magic vars 32134 1727204457.16283: variable 'ansible_distribution_major_version' from source: facts 32134 1727204457.16288: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204457.16298: variable 'omit' from source: magic vars 32134 1727204457.16344: variable 'omit' from source: magic vars 32134 1727204457.16372: variable 'omit' from source: magic vars 32134 1727204457.16413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204457.16443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204457.16462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204457.16478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204457.16496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204457.16521: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204457.16525: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204457.16530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204457.16620: Set connection var ansible_timeout to 10 32134 1727204457.16631: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204457.16634: Set connection var ansible_connection to ssh 32134 1727204457.16637: Set connection var ansible_shell_type to sh 32134 1727204457.16645: Set connection var ansible_shell_executable to /bin/sh 32134 1727204457.16651: Set connection var ansible_pipelining to False 32134 1727204457.16671: variable 'ansible_shell_executable' from source: unknown 32134 1727204457.16674: variable 'ansible_connection' from source: unknown 32134 1727204457.16677: variable 'ansible_module_compression' from source: unknown 32134 1727204457.16680: variable 'ansible_shell_type' from source: unknown 32134 1727204457.16685: variable 'ansible_shell_executable' from source: unknown 32134 1727204457.16687: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204457.16695: variable 'ansible_pipelining' from source: unknown 32134 1727204457.16697: variable 'ansible_timeout' from source: unknown 32134 1727204457.16703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204457.16872: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204457.16885: variable 'omit' from source: magic vars 32134 
1727204457.16892: starting attempt loop 32134 1727204457.16895: running the handler 32134 1727204457.16910: _low_level_execute_command(): starting 32134 1727204457.16918: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204457.17474: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204457.17478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.17483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204457.17486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.17545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204457.17553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204457.17600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204457.19368: stdout chunk (state=3): >>>/root <<< 32134 1727204457.19475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204457.19537: stderr chunk (state=3): >>><<< 32134 1727204457.19540: stdout chunk (state=3): >>><<< 32134 1727204457.19561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204457.19574: _low_level_execute_command(): starting 32134 1727204457.19580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205 `" && echo ansible-tmp-1727204457.1956081-33697-187147708152205="` 
echo /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205 `" ) && sleep 0' 32134 1727204457.20062: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204457.20066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.20068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204457.20078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.20129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204457.20134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204457.20177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204457.22324: stdout chunk (state=3): >>>ansible-tmp-1727204457.1956081-33697-187147708152205=/root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205 <<< 32134 1727204457.22439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204457.22497: stderr chunk (state=3): >>><<< 32134 1727204457.22501: stdout chunk (state=3): >>><<< 32134 1727204457.22518: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204457.1956081-33697-187147708152205=/root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204457.22570: variable 'ansible_module_compression' from source: unknown 32134 1727204457.22610: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 32134 
1727204457.22646: variable 'ansible_facts' from source: unknown 32134 1727204457.22713: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py 32134 1727204457.22831: Sending initial data 32134 1727204457.22835: Sent initial data (162 bytes) 32134 1727204457.23319: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204457.23324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204457.23327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204457.23330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.23381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204457.23386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204457.23431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204457.25153: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204457.25186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204457.25224: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpwbx29puj /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py <<< 32134 1727204457.25231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py" <<< 32134 1727204457.25259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpwbx29puj" to remote "/root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py" <<< 32134 1727204457.25264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py" <<< 32134 1727204457.26066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204457.26143: stderr chunk (state=3): >>><<< 32134 1727204457.26147: stdout chunk (state=3): >>><<< 32134 1727204457.26170: done transferring module to remote 32134 1727204457.26181: _low_level_execute_command(): starting 32134 1727204457.26187: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/ /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py && sleep 0' 32134 1727204457.26678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204457.26681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204457.26684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204457.26686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204457.26688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.26749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204457.26759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204457.26793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204457.28779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204457.28839: stderr chunk (state=3): >>><<< 32134 1727204457.28843: stdout chunk (state=3): >>><<< 32134 1727204457.28858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204457.28866: _low_level_execute_command(): starting 32134 1727204457.28869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/AnsiballZ_service_facts.py && sleep 0' 32134 1727204457.29372: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204457.29376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.29379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204457.29382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204457.29444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204457.29451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204457.29500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204459.33905: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "seria<<< 32134 1727204459.33921: stdout chunk (state=3): >>>l-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service"<<< 32134 1727204459.33952: stdout chunk (state=3): >>>, "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 32134 1727204459.33965: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd<<< 32134 1727204459.33991: stdout chunk (state=3): >>>"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32134 1727204459.35667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204459.35733: stderr chunk (state=3): >>><<< 32134 1727204459.35737: stdout chunk (state=3): >>><<< 32134 1727204459.35773: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": 
"systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204459.36454: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204459.36516: _low_level_execute_command(): starting 32134 1727204459.36519: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204457.1956081-33697-187147708152205/ > /dev/null 2>&1 && sleep 0' 32134 1727204459.36958: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204459.36962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204459.36964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204459.36967: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204459.36969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204459.37030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204459.37032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204459.37069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204459.39069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204459.39121: stderr chunk (state=3): >>><<< 32134 1727204459.39124: stdout chunk (state=3): >>><<< 32134 1727204459.39140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204459.39148: handler run complete 32134 1727204459.39319: variable 'ansible_facts' from source: unknown 32134 1727204459.39458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204459.40009: variable 'ansible_facts' from source: unknown 32134 1727204459.40132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204459.40330: attempt loop complete, returning result 32134 1727204459.40337: _execute() done 32134 1727204459.40340: dumping result to json 32134 1727204459.40391: done dumping result, returning 32134 1727204459.40400: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-753f-5162-000000000567] 32134 1727204459.40406: sending task result for task 12b410aa-8751-753f-5162-000000000567 32134 1727204459.41246: done sending task result for task 12b410aa-8751-753f-5162-000000000567 32134 1727204459.41250: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204459.41307: no more pending results, returning what we have 32134 1727204459.41309: results queue empty 32134 1727204459.41310: checking for any_errors_fatal 32134 1727204459.41313: done checking for any_errors_fatal 32134 1727204459.41314: checking for max_fail_percentage 32134 1727204459.41315: done checking for max_fail_percentage 32134 1727204459.41315: checking to see if all hosts have failed and the running result is not ok 32134 1727204459.41316: done checking to see if all hosts have 
failed 32134 1727204459.41317: getting the remaining hosts for this loop 32134 1727204459.41318: done getting the remaining hosts for this loop 32134 1727204459.41321: getting the next task for host managed-node2 32134 1727204459.41327: done getting next task for host managed-node2 32134 1727204459.41330: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 32134 1727204459.41332: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204459.41339: getting variables 32134 1727204459.41341: in VariableManager get_vars() 32134 1727204459.41366: Calling all_inventory to load vars for managed-node2 32134 1727204459.41368: Calling groups_inventory to load vars for managed-node2 32134 1727204459.41369: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204459.41377: Calling all_plugins_play to load vars for managed-node2 32134 1727204459.41379: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204459.41381: Calling groups_plugins_play to load vars for managed-node2 32134 1727204459.42559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204459.44161: done with get_vars() 32134 1727204459.44183: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:00:59 -0400 (0:00:02.290) 0:00:33.846 ***** 32134 1727204459.44264: entering _queue_task() for managed-node2/package_facts 32134 1727204459.44511: worker is 1 (out of 1 available) 32134 1727204459.44527: exiting _queue_task() for managed-node2/package_facts 32134 1727204459.44538: done queuing things up, now waiting for results queue to drain 32134 1727204459.44541: waiting for pending results... 
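For orientation at this point in the trace: the block above is the (censored) result of the role's "Check which services are running" task, which the earlier invocation record shows was a bare service_facts call with empty module_args, and the records that follow queue and run the next task, "Check which packages are installed", whose task path the log gives as roles/network/tasks/set_facts.yml:26. The sketch below shows what those two tasks plausibly look like in playbook YAML; it is a reconstruction from this trace, not a copy of the actual set_facts.yml, and only the service task's no_log: true is actually confirmed here (by the "censored ... 'no_log: true'" result above).

  # Sketch only -- reconstructed from this trace, not taken from the role source.
  - name: Check which services are running
    ansible.builtin.service_facts:
    no_log: true          # confirmed by the censored task result logged above

  - name: Check which packages are installed
    ansible.builtin.package_facts:
      manager: auto       # assumed default; this task's module_args are not shown in the trace
    no_log: true          # assumed to mirror the service task; not confirmed by this log

Both modules return their data under ansible_facts (the services and packages keys respectively), which is the ansible_facts variable the trace resolves right after each "handler run complete" record.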
32134 1727204459.44737: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 32134 1727204459.44846: in run() - task 12b410aa-8751-753f-5162-000000000568 32134 1727204459.44860: variable 'ansible_search_path' from source: unknown 32134 1727204459.44864: variable 'ansible_search_path' from source: unknown 32134 1727204459.44901: calling self._execute() 32134 1727204459.44990: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204459.44998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204459.45006: variable 'omit' from source: magic vars 32134 1727204459.45339: variable 'ansible_distribution_major_version' from source: facts 32134 1727204459.45350: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204459.45357: variable 'omit' from source: magic vars 32134 1727204459.45404: variable 'omit' from source: magic vars 32134 1727204459.45441: variable 'omit' from source: magic vars 32134 1727204459.45475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204459.45510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204459.45536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204459.45551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204459.45563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204459.45592: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204459.45595: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204459.45600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204459.45686: Set connection var ansible_timeout to 10 32134 1727204459.45701: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204459.45704: Set connection var ansible_connection to ssh 32134 1727204459.45707: Set connection var ansible_shell_type to sh 32134 1727204459.45716: Set connection var ansible_shell_executable to /bin/sh 32134 1727204459.45723: Set connection var ansible_pipelining to False 32134 1727204459.45743: variable 'ansible_shell_executable' from source: unknown 32134 1727204459.45752: variable 'ansible_connection' from source: unknown 32134 1727204459.45755: variable 'ansible_module_compression' from source: unknown 32134 1727204459.45758: variable 'ansible_shell_type' from source: unknown 32134 1727204459.45762: variable 'ansible_shell_executable' from source: unknown 32134 1727204459.45765: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204459.45768: variable 'ansible_pipelining' from source: unknown 32134 1727204459.45771: variable 'ansible_timeout' from source: unknown 32134 1727204459.45778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204459.45952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204459.45962: variable 'omit' from source: magic vars 32134 
1727204459.45967: starting attempt loop 32134 1727204459.45971: running the handler 32134 1727204459.45987: _low_level_execute_command(): starting 32134 1727204459.45994: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204459.46547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204459.46551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204459.46555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204459.46610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204459.46617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204459.46619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204459.46662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204459.48415: stdout chunk (state=3): >>>/root <<< 32134 1727204459.48603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204459.48746: stderr chunk (state=3): >>><<< 32134 1727204459.48750: stdout chunk (state=3): >>><<< 32134 1727204459.48754: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204459.48757: _low_level_execute_command(): starting 32134 1727204459.48761: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561 `" && echo ansible-tmp-1727204459.4869893-33727-107607942980561="` 
echo /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561 `" ) && sleep 0' 32134 1727204459.49199: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204459.49226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204459.49272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204459.49291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204459.49333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204459.51409: stdout chunk (state=3): >>>ansible-tmp-1727204459.4869893-33727-107607942980561=/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561 <<< 32134 1727204459.51616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204459.51620: stdout chunk (state=3): >>><<< 32134 1727204459.51622: stderr chunk (state=3): >>><<< 32134 1727204459.51795: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204459.4869893-33727-107607942980561=/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204459.51799: variable 'ansible_module_compression' from source: unknown 32134 1727204459.51802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 32134 1727204459.51835: variable 'ansible_facts' from source: unknown 32134 1727204459.52078: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py 32134 1727204459.52268: Sending initial data 32134 1727204459.52278: Sent initial data (162 bytes) 32134 1727204459.52966: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204459.53036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204459.53108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204459.53149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204459.53225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204459.54938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204459.54992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204459.55028: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpgprh0o1o /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py <<< 32134 1727204459.55035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py" <<< 32134 1727204459.55064: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpgprh0o1o" to remote "/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py" <<< 32134 1727204459.57052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204459.57055: stdout chunk (state=3): >>><<< 32134 1727204459.57058: stderr chunk (state=3): >>><<< 32134 1727204459.57060: done transferring module to remote 32134 1727204459.57062: _low_level_execute_command(): starting 32134 1727204459.57064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/ /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py && sleep 0' 32134 1727204459.57658: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204459.57708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204459.57725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204459.57751: stderr chunk (state=3): >>>debug2: match found <<< 32134 1727204459.57851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204459.57873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204459.57954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204459.59986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204459.59991: stdout chunk (state=3): >>><<< 32134 1727204459.59994: stderr chunk (state=3): >>><<< 32134 1727204459.60113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204459.60122: _low_level_execute_command(): starting 32134 1727204459.60125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/AnsiballZ_package_facts.py && sleep 0' 32134 1727204459.60746: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204459.60760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204459.60884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204459.60941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204459.60982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204460.26056: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": 
"fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": 
"realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "r<<< 32134 1727204460.26083: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 32134 1727204460.26196: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": 
[{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", 
"version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": 
"238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 32134 1727204460.26270: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": 
"39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": 
"1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 32134 1727204460.26309: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32134 1727204460.28302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204460.28324: stderr chunk (state=3): >>><<< 32134 1727204460.28334: stdout chunk (state=3): >>><<< 32134 1727204460.28385: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204460.32980: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204460.33001: _low_level_execute_command(): starting 32134 1727204460.33008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204459.4869893-33727-107607942980561/ > /dev/null 2>&1 && sleep 0' 32134 1727204460.33653: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204460.33663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204460.33676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204460.33695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204460.33708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204460.33719: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204460.33732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204460.33747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32134 1727204460.33798: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204460.33802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32134 1727204460.33804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204460.33807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204460.33810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204460.33812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204460.33818: stderr chunk (state=3): >>>debug2: match found <<< 32134 1727204460.33832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204460.33907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204460.33923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 32134 1727204460.33961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204460.34009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204460.36198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204460.36202: stdout chunk (state=3): >>><<< 32134 1727204460.36204: stderr chunk (state=3): >>><<< 32134 1727204460.36207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204460.36210: handler run complete 32134 1727204460.37700: variable 'ansible_facts' from source: unknown 32134 1727204460.38653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.42568: variable 'ansible_facts' from source: unknown 32134 1727204460.43424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.44939: attempt loop complete, returning result 32134 1727204460.44964: _execute() done 32134 1727204460.44972: dumping result to json 32134 1727204460.45310: done dumping result, returning 32134 1727204460.45347: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-753f-5162-000000000568] 32134 1727204460.45350: sending task result for task 12b410aa-8751-753f-5162-000000000568 32134 1727204460.49283: done sending task result for task 12b410aa-8751-753f-5162-000000000568 32134 1727204460.49287: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204460.49452: no more pending results, returning what we have 32134 1727204460.49455: results queue empty 32134 1727204460.49456: checking for any_errors_fatal 32134 1727204460.49461: done checking for any_errors_fatal 32134 1727204460.49462: checking for max_fail_percentage 32134 1727204460.49463: done checking for max_fail_percentage 32134 1727204460.49465: checking to see if all hosts have failed and the running result is not ok 32134 1727204460.49466: done checking to see if all hosts have failed 32134 1727204460.49467: getting the remaining hosts for this loop 32134 1727204460.49468: done getting the remaining hosts for this loop 32134 1727204460.49472: getting 
the next task for host managed-node2 32134 1727204460.49479: done getting next task for host managed-node2 32134 1727204460.49489: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32134 1727204460.49493: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204460.49504: getting variables 32134 1727204460.49505: in VariableManager get_vars() 32134 1727204460.49541: Calling all_inventory to load vars for managed-node2 32134 1727204460.49545: Calling groups_inventory to load vars for managed-node2 32134 1727204460.49548: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204460.49559: Calling all_plugins_play to load vars for managed-node2 32134 1727204460.49562: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204460.49566: Calling groups_plugins_play to load vars for managed-node2 32134 1727204460.51790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.54962: done with get_vars() 32134 1727204460.55015: done getting variables 32134 1727204460.55097: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:00 -0400 (0:00:01.108) 0:00:34.955 ***** 32134 1727204460.55150: entering _queue_task() for managed-node2/debug 32134 1727204460.55561: worker is 1 (out of 1 available) 32134 1727204460.55690: exiting _queue_task() for managed-node2/debug 32134 1727204460.55705: done queuing things up, now waiting for results queue to drain 32134 1727204460.55707: waiting for pending results... 
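The trace above closes out the "Check which packages are installed" step: the package_facts module ran with "manager": ["auto"] and "strategy": "first" (visible in the invocation dump), the large JSON above is the ansible_facts.packages dictionary it returns, and the task result was censored because no_log was in effect. A minimal sketch of a task shaped like that step, reconstructed only from what the trace shows; it is not the role's verbatim source.

```yaml
# Hedged reconstruction of the traced "Check which packages are installed" step.
# manager/strategy come from the module invocation logged above; no_log is
# inferred from the censored result.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # invocation showed "manager": ["auto"]
    strategy: first    # invocation showed "strategy": "first"
  no_log: true         # result was hidden: "'no_log: true' was specified for this result"
```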
32134 1727204460.56007: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 32134 1727204460.56062: in run() - task 12b410aa-8751-753f-5162-000000000085 32134 1727204460.56083: variable 'ansible_search_path' from source: unknown 32134 1727204460.56092: variable 'ansible_search_path' from source: unknown 32134 1727204460.56144: calling self._execute() 32134 1727204460.56268: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.56295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.56300: variable 'omit' from source: magic vars 32134 1727204460.56782: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.56786: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204460.56799: variable 'omit' from source: magic vars 32134 1727204460.56856: variable 'omit' from source: magic vars 32134 1727204460.57094: variable 'network_provider' from source: set_fact 32134 1727204460.57097: variable 'omit' from source: magic vars 32134 1727204460.57102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204460.57132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204460.57161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204460.57187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204460.57209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204460.57259: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204460.57270: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.57280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.57439: Set connection var ansible_timeout to 10 32134 1727204460.57456: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204460.57548: Set connection var ansible_connection to ssh 32134 1727204460.57552: Set connection var ansible_shell_type to sh 32134 1727204460.57554: Set connection var ansible_shell_executable to /bin/sh 32134 1727204460.57556: Set connection var ansible_pipelining to False 32134 1727204460.57558: variable 'ansible_shell_executable' from source: unknown 32134 1727204460.57562: variable 'ansible_connection' from source: unknown 32134 1727204460.57564: variable 'ansible_module_compression' from source: unknown 32134 1727204460.57566: variable 'ansible_shell_type' from source: unknown 32134 1727204460.57568: variable 'ansible_shell_executable' from source: unknown 32134 1727204460.57571: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.57573: variable 'ansible_pipelining' from source: unknown 32134 1727204460.57575: variable 'ansible_timeout' from source: unknown 32134 1727204460.57581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.57771: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 32134 1727204460.57794: variable 'omit' from source: magic vars 32134 1727204460.57805: starting attempt loop 32134 1727204460.57816: running the handler 32134 1727204460.57879: handler run complete 32134 1727204460.57906: attempt loop complete, returning result 32134 1727204460.57918: _execute() done 32134 1727204460.57927: dumping result to json 32134 1727204460.57936: done dumping result, returning 32134 1727204460.57983: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-753f-5162-000000000085] 32134 1727204460.57987: sending task result for task 12b410aa-8751-753f-5162-000000000085 ok: [managed-node2] => {} MSG: Using network provider: nm 32134 1727204460.58359: no more pending results, returning what we have 32134 1727204460.58363: results queue empty 32134 1727204460.58364: checking for any_errors_fatal 32134 1727204460.58373: done checking for any_errors_fatal 32134 1727204460.58374: checking for max_fail_percentage 32134 1727204460.58376: done checking for max_fail_percentage 32134 1727204460.58377: checking to see if all hosts have failed and the running result is not ok 32134 1727204460.58378: done checking to see if all hosts have failed 32134 1727204460.58379: getting the remaining hosts for this loop 32134 1727204460.58381: done getting the remaining hosts for this loop 32134 1727204460.58385: getting the next task for host managed-node2 32134 1727204460.58393: done getting next task for host managed-node2 32134 1727204460.58398: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32134 1727204460.58401: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204460.58416: getting variables 32134 1727204460.58418: in VariableManager get_vars() 32134 1727204460.58461: Calling all_inventory to load vars for managed-node2 32134 1727204460.58465: Calling groups_inventory to load vars for managed-node2 32134 1727204460.58468: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204460.58479: Calling all_plugins_play to load vars for managed-node2 32134 1727204460.58483: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204460.58487: Calling groups_plugins_play to load vars for managed-node2 32134 1727204460.58505: done sending task result for task 12b410aa-8751-753f-5162-000000000085 32134 1727204460.58508: WORKER PROCESS EXITING 32134 1727204460.61015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.64232: done with get_vars() 32134 1727204460.64278: done getting variables 32134 1727204460.64354: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:00 -0400 (0:00:00.092) 0:00:35.047 ***** 32134 1727204460.64399: entering _queue_task() for managed-node2/fail 32134 1727204460.64995: worker is 1 (out of 1 available) 32134 1727204460.65007: exiting _queue_task() for managed-node2/fail 32134 1727204460.65021: done queuing things up, now waiting for results queue to drain 32134 1727204460.65023: waiting for pending results... 
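The "Print network provider" task traced above (main.yml:7) is a plain debug action: the trace shows it reading network_provider from set_fact and emitting "Using network provider: nm". A minimal sketch under those assumptions; the message template is inferred from the logged output rather than copied from the role source.

```yaml
# Hedged sketch of the traced debug task (main.yml:7). The message template is
# inferred from the logged output "Using network provider: nm".
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```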
32134 1727204460.65148: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32134 1727204460.65297: in run() - task 12b410aa-8751-753f-5162-000000000086 32134 1727204460.65322: variable 'ansible_search_path' from source: unknown 32134 1727204460.65331: variable 'ansible_search_path' from source: unknown 32134 1727204460.65382: calling self._execute() 32134 1727204460.65513: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.65530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.65548: variable 'omit' from source: magic vars 32134 1727204460.66019: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.66049: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204460.66230: variable 'network_state' from source: role '' defaults 32134 1727204460.66258: Evaluated conditional (network_state != {}): False 32134 1727204460.66268: when evaluation is False, skipping this task 32134 1727204460.66277: _execute() done 32134 1727204460.66367: dumping result to json 32134 1727204460.66371: done dumping result, returning 32134 1727204460.66374: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-753f-5162-000000000086] 32134 1727204460.66378: sending task result for task 12b410aa-8751-753f-5162-000000000086 32134 1727204460.66463: done sending task result for task 12b410aa-8751-753f-5162-000000000086 32134 1727204460.66580: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204460.66651: no more pending results, returning what we have 32134 1727204460.66658: results queue empty 32134 1727204460.66659: checking for any_errors_fatal 32134 1727204460.66670: done checking for any_errors_fatal 32134 1727204460.66671: checking for max_fail_percentage 32134 1727204460.66673: done checking for max_fail_percentage 32134 1727204460.66675: checking to see if all hosts have failed and the running result is not ok 32134 1727204460.66676: done checking to see if all hosts have failed 32134 1727204460.66677: getting the remaining hosts for this loop 32134 1727204460.66679: done getting the remaining hosts for this loop 32134 1727204460.66684: getting the next task for host managed-node2 32134 1727204460.66694: done getting next task for host managed-node2 32134 1727204460.66699: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32134 1727204460.66703: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204460.66724: getting variables 32134 1727204460.66726: in VariableManager get_vars() 32134 1727204460.66772: Calling all_inventory to load vars for managed-node2 32134 1727204460.66776: Calling groups_inventory to load vars for managed-node2 32134 1727204460.66779: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204460.66910: Calling all_plugins_play to load vars for managed-node2 32134 1727204460.66917: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204460.66923: Calling groups_plugins_play to load vars for managed-node2 32134 1727204460.69551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.72763: done with get_vars() 32134 1727204460.72824: done getting variables 32134 1727204460.72905: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:00 -0400 (0:00:00.085) 0:00:35.133 ***** 32134 1727204460.72947: entering _queue_task() for managed-node2/fail 32134 1727204460.73519: worker is 1 (out of 1 available) 32134 1727204460.73532: exiting _queue_task() for managed-node2/fail 32134 1727204460.73545: done queuing things up, now waiting for results queue to drain 32134 1727204460.73547: waiting for pending results... 
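The skip traced above for main.yml:11 reports false_condition "network_state != {}", so the abort only fires when a declarative network_state is supplied; the check that the provider is initscripts is implied by the task name but never reached in this trace. A hedged sketch of a fail task with that shape; the message wording and the second condition are illustrative assumptions. The next task queued above (main.yml:18) skips on the same network_state != {} condition, as the following trace shows.

```yaml
# Hedged sketch of the traced abort task (main.yml:11). Only the first condition
# appears in the trace; the provider check and the message are assumptions.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # hypothetical wording
  when:
    - network_state != {}                  # condition shown in the trace
    - network_provider == "initscripts"    # assumed from the task name
```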
32134 1727204460.73731: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32134 1727204460.73874: in run() - task 12b410aa-8751-753f-5162-000000000087 32134 1727204460.73909: variable 'ansible_search_path' from source: unknown 32134 1727204460.73923: variable 'ansible_search_path' from source: unknown 32134 1727204460.73970: calling self._execute() 32134 1727204460.74097: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.74124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.74195: variable 'omit' from source: magic vars 32134 1727204460.74637: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.74664: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204460.74841: variable 'network_state' from source: role '' defaults 32134 1727204460.74859: Evaluated conditional (network_state != {}): False 32134 1727204460.74875: when evaluation is False, skipping this task 32134 1727204460.74886: _execute() done 32134 1727204460.74896: dumping result to json 32134 1727204460.74906: done dumping result, returning 32134 1727204460.74922: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-753f-5162-000000000087] 32134 1727204460.74984: sending task result for task 12b410aa-8751-753f-5162-000000000087 32134 1727204460.75066: done sending task result for task 12b410aa-8751-753f-5162-000000000087 32134 1727204460.75069: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204460.75146: no more pending results, returning what we have 32134 1727204460.75151: results queue empty 32134 1727204460.75152: checking for any_errors_fatal 32134 1727204460.75164: done checking for any_errors_fatal 32134 1727204460.75165: checking for max_fail_percentage 32134 1727204460.75167: done checking for max_fail_percentage 32134 1727204460.75168: checking to see if all hosts have failed and the running result is not ok 32134 1727204460.75169: done checking to see if all hosts have failed 32134 1727204460.75170: getting the remaining hosts for this loop 32134 1727204460.75172: done getting the remaining hosts for this loop 32134 1727204460.75177: getting the next task for host managed-node2 32134 1727204460.75184: done getting next task for host managed-node2 32134 1727204460.75191: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32134 1727204460.75194: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204460.75216: getting variables 32134 1727204460.75219: in VariableManager get_vars() 32134 1727204460.75264: Calling all_inventory to load vars for managed-node2 32134 1727204460.75268: Calling groups_inventory to load vars for managed-node2 32134 1727204460.75271: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204460.75287: Calling all_plugins_play to load vars for managed-node2 32134 1727204460.75408: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204460.75419: Calling groups_plugins_play to load vars for managed-node2 32134 1727204460.78017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.81078: done with get_vars() 32134 1727204460.81117: done getting variables 32134 1727204460.81172: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:00 -0400 (0:00:00.082) 0:00:35.215 ***** 32134 1727204460.81202: entering _queue_task() for managed-node2/fail 32134 1727204460.81486: worker is 1 (out of 1 available) 32134 1727204460.81502: exiting _queue_task() for managed-node2/fail 32134 1727204460.81517: done queuing things up, now waiting for results queue to drain 32134 1727204460.81519: waiting for pending results... 
32134 1727204460.81722: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32134 1727204460.81816: in run() - task 12b410aa-8751-753f-5162-000000000088 32134 1727204460.81832: variable 'ansible_search_path' from source: unknown 32134 1727204460.81837: variable 'ansible_search_path' from source: unknown 32134 1727204460.81872: calling self._execute() 32134 1727204460.81966: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.81979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.81987: variable 'omit' from source: magic vars 32134 1727204460.82333: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.82344: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204460.82499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204460.85003: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204460.85070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204460.85102: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204460.85140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204460.85161: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204460.85232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204460.85261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204460.85282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204460.85317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204460.85330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204460.85419: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.85433: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32134 1727204460.85538: variable 'ansible_distribution' from source: facts 32134 1727204460.85543: variable '__network_rh_distros' from source: role '' defaults 32134 1727204460.85552: Evaluated conditional (ansible_distribution in __network_rh_distros): False 32134 1727204460.85556: when evaluation is False, skipping this task 32134 1727204460.85563: _execute() done 32134 1727204460.85565: dumping result to json 32134 1727204460.85570: done dumping result, returning 32134 1727204460.85579: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-753f-5162-000000000088] 32134 1727204460.85586: sending task result for task 12b410aa-8751-753f-5162-000000000088 32134 1727204460.85687: done sending task result for task 12b410aa-8751-753f-5162-000000000088 32134 1727204460.85694: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 32134 1727204460.85745: no more pending results, returning what we have 32134 1727204460.85750: results queue empty 32134 1727204460.85751: checking for any_errors_fatal 32134 1727204460.85760: done checking for any_errors_fatal 32134 1727204460.85761: checking for max_fail_percentage 32134 1727204460.85762: done checking for max_fail_percentage 32134 1727204460.85763: checking to see if all hosts have failed and the running result is not ok 32134 1727204460.85764: done checking to see if all hosts have failed 32134 1727204460.85765: getting the remaining hosts for this loop 32134 1727204460.85766: done getting the remaining hosts for this loop 32134 1727204460.85771: getting the next task for host managed-node2 32134 1727204460.85777: done getting next task for host managed-node2 32134 1727204460.85781: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32134 1727204460.85784: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204460.85801: getting variables 32134 1727204460.85803: in VariableManager get_vars() 32134 1727204460.85846: Calling all_inventory to load vars for managed-node2 32134 1727204460.85849: Calling groups_inventory to load vars for managed-node2 32134 1727204460.85852: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204460.85863: Calling all_plugins_play to load vars for managed-node2 32134 1727204460.85866: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204460.85870: Calling groups_plugins_play to load vars for managed-node2 32134 1727204460.88062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204460.89818: done with get_vars() 32134 1727204460.89857: done getting variables 32134 1727204460.89931: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:00 -0400 (0:00:00.087) 0:00:35.303 ***** 32134 1727204460.89969: entering _queue_task() for managed-node2/dnf 32134 1727204460.90352: worker is 1 (out of 1 available) 32134 1727204460.90366: exiting _queue_task() for managed-node2/dnf 32134 1727204460.90379: done queuing things up, now waiting for results queue to drain 32134 1727204460.90381: waiting for pending results... 
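The teaming abort traced just above (main.yml:25) evaluates two conditions in order: ansible_distribution_major_version | int > 9 (True on this Fedora 39 host) and ansible_distribution in __network_rh_distros (False, so the task skips before any further checks run). A hedged sketch of that condition chain; the failure message and any later team-specific checks, which never ran here, are assumptions.

```yaml
# Hedged sketch of the traced teaming abort (main.yml:25). The two "when" entries
# are the evaluations visible in the trace; the message is an assumption.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```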
32134 1727204460.90722: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32134 1727204460.90995: in run() - task 12b410aa-8751-753f-5162-000000000089 32134 1727204460.90999: variable 'ansible_search_path' from source: unknown 32134 1727204460.91002: variable 'ansible_search_path' from source: unknown 32134 1727204460.91004: calling self._execute() 32134 1727204460.91035: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204460.91049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204460.91067: variable 'omit' from source: magic vars 32134 1727204460.91534: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.91559: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204460.91815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204460.94052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204460.94109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204460.94142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204460.94175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204460.94202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204460.94273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204460.94299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204460.94324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204460.94358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204460.94371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204460.94481: variable 'ansible_distribution' from source: facts 32134 1727204460.94485: variable 'ansible_distribution_major_version' from source: facts 32134 1727204460.94496: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32134 1727204460.94597: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204460.94716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204460.94737: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204460.94763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204460.94797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204460.94810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204460.94848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204460.94871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204460.94895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204460.94928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204460.94941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204460.94978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204460.95000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204460.95023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204460.95056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204460.95069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204460.95208: variable 'network_connections' from source: play vars 32134 1727204460.95221: variable 'profile' from source: play vars 32134 1727204460.95394: variable 'profile' from source: play vars 32134 1727204460.95398: variable 'interface' from source: set_fact 32134 1727204460.95400: variable 'interface' from source: set_fact 32134 1727204460.95473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 32134 1727204460.95754: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204460.95809: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204460.95860: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204460.95914: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204460.95981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204460.96017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204460.96074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204460.96118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204460.96192: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204460.96477: variable 'network_connections' from source: play vars 32134 1727204460.96481: variable 'profile' from source: play vars 32134 1727204460.96547: variable 'profile' from source: play vars 32134 1727204460.96550: variable 'interface' from source: set_fact 32134 1727204460.96601: variable 'interface' from source: set_fact 32134 1727204460.96628: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204460.96632: when evaluation is False, skipping this task 32134 1727204460.96635: _execute() done 32134 1727204460.96638: dumping result to json 32134 1727204460.96640: done dumping result, returning 32134 1727204460.96695: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-753f-5162-000000000089] 32134 1727204460.96698: sending task result for task 12b410aa-8751-753f-5162-000000000089 32134 1727204460.96771: done sending task result for task 12b410aa-8751-753f-5162-000000000089 32134 1727204460.96774: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204460.96834: no more pending results, returning what we have 32134 1727204460.96839: results queue empty 32134 1727204460.96839: checking for any_errors_fatal 32134 1727204460.96848: done checking for any_errors_fatal 32134 1727204460.96849: checking for max_fail_percentage 32134 1727204460.96851: done checking for max_fail_percentage 32134 1727204460.96852: checking to see if all hosts have failed and the running result is not ok 32134 1727204460.96853: done checking to see if all hosts have failed 32134 1727204460.96854: getting the remaining hosts for this loop 32134 1727204460.96855: done getting the remaining hosts for this loop 32134 
1727204460.96859: getting the next task for host managed-node2 32134 1727204460.96865: done getting next task for host managed-node2 32134 1727204460.96870: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32134 1727204460.96872: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204460.96890: getting variables 32134 1727204460.96892: in VariableManager get_vars() 32134 1727204460.96935: Calling all_inventory to load vars for managed-node2 32134 1727204460.96938: Calling groups_inventory to load vars for managed-node2 32134 1727204460.96941: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204460.96952: Calling all_plugins_play to load vars for managed-node2 32134 1727204460.96955: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204460.96959: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.02280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.03898: done with get_vars() 32134 1727204461.03934: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32134 1727204461.03996: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.140) 0:00:35.444 ***** 32134 1727204461.04019: entering _queue_task() for managed-node2/yum 32134 1727204461.04304: worker is 1 (out of 1 available) 32134 1727204461.04319: exiting _queue_task() for managed-node2/yum 32134 1727204461.04334: done queuing things up, now waiting for results queue to drain 32134 1727204461.04336: waiting for pending results... 
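The DNF check traced above (main.yml:36) uses the dnf action plugin and skips because __network_wireless_connections_defined or __network_team_connections_defined evaluates False: the trace walks network_connections, profile and interface from play vars and finds neither a wireless nor a team profile. Both "when" entries below are the conditionals the trace evaluated; the package list and state are illustrative assumptions.

```yaml
# Hedged sketch of the traced DNF check (main.yml:36). The "when" entries are the
# two conditionals evaluated in the trace; name/state are assumptions.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical variable name
    state: latest
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```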
32134 1727204461.04537: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32134 1727204461.04632: in run() - task 12b410aa-8751-753f-5162-00000000008a 32134 1727204461.04645: variable 'ansible_search_path' from source: unknown 32134 1727204461.04649: variable 'ansible_search_path' from source: unknown 32134 1727204461.04684: calling self._execute() 32134 1727204461.04773: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.04784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.04796: variable 'omit' from source: magic vars 32134 1727204461.05146: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.05157: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.05317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204461.08026: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204461.08196: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204461.08200: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204461.08263: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204461.08305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204461.08429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.08487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.08534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.08658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.08670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.08805: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.08833: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32134 1727204461.08876: when evaluation is False, skipping this task 32134 1727204461.08884: _execute() done 32134 1727204461.08887: dumping result to json 32134 1727204461.08892: done dumping result, returning 32134 1727204461.08899: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000008a] 32134 
1727204461.08902: sending task result for task 12b410aa-8751-753f-5162-00000000008a skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32134 1727204461.09095: no more pending results, returning what we have 32134 1727204461.09100: results queue empty 32134 1727204461.09101: checking for any_errors_fatal 32134 1727204461.09114: done checking for any_errors_fatal 32134 1727204461.09115: checking for max_fail_percentage 32134 1727204461.09118: done checking for max_fail_percentage 32134 1727204461.09119: checking to see if all hosts have failed and the running result is not ok 32134 1727204461.09120: done checking to see if all hosts have failed 32134 1727204461.09121: getting the remaining hosts for this loop 32134 1727204461.09123: done getting the remaining hosts for this loop 32134 1727204461.09128: getting the next task for host managed-node2 32134 1727204461.09136: done getting next task for host managed-node2 32134 1727204461.09144: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32134 1727204461.09146: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204461.09163: getting variables 32134 1727204461.09165: in VariableManager get_vars() 32134 1727204461.09438: Calling all_inventory to load vars for managed-node2 32134 1727204461.09442: Calling groups_inventory to load vars for managed-node2 32134 1727204461.09446: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204461.09454: done sending task result for task 12b410aa-8751-753f-5162-00000000008a 32134 1727204461.09458: WORKER PROCESS EXITING 32134 1727204461.09470: Calling all_plugins_play to load vars for managed-node2 32134 1727204461.09475: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204461.09480: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.11260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.12904: done with get_vars() 32134 1727204461.12932: done getting variables 32134 1727204461.12982: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.089) 0:00:35.534 ***** 32134 1727204461.13016: entering _queue_task() for managed-node2/fail 32134 1727204461.13286: worker is 1 (out of 1 available) 32134 1727204461.13303: exiting _queue_task() for managed-node2/fail 32134 1727204461.13319: done queuing things up, now waiting for results queue to drain 32134 1727204461.13321: waiting for pending results... 
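Two details of the YUM variant traced above (main.yml:48) are visible in the log: ansible.builtin.yum is redirected to the dnf action plugin on this controller, and the task skips immediately on ansible_distribution_major_version | int < 8, so it only applies to EL7-era hosts. A hedged sketch with just the condition the trace shows; the module arguments and the wireless/team gate (which never ran here) mirror the DNF variant and are assumptions.

```yaml
# Hedged sketch of the traced YUM check (main.yml:48). Only the first "when"
# condition is visible in the trace; the rest is assumed by analogy with the
# DNF variant.
- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # hypothetical variable name
    state: latest
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined
```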
32134 1727204461.13522: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32134 1727204461.13613: in run() - task 12b410aa-8751-753f-5162-00000000008b 32134 1727204461.13626: variable 'ansible_search_path' from source: unknown 32134 1727204461.13630: variable 'ansible_search_path' from source: unknown 32134 1727204461.13666: calling self._execute() 32134 1727204461.13750: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.13757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.13769: variable 'omit' from source: magic vars 32134 1727204461.14108: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.14121: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.14230: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204461.14401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204461.16484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204461.16541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204461.16580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204461.16617: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204461.16639: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204461.16709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.16739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.16761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.16795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.16808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.16854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.16874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.16895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.16928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.16943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.16979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.17004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.17026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.17061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.17074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.17220: variable 'network_connections' from source: play vars 32134 1727204461.17232: variable 'profile' from source: play vars 32134 1727204461.17301: variable 'profile' from source: play vars 32134 1727204461.17304: variable 'interface' from source: set_fact 32134 1727204461.17357: variable 'interface' from source: set_fact 32134 1727204461.17424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204461.17557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204461.17595: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204461.17621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204461.17655: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204461.17691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204461.17717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204461.17737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.17758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204461.17803: 
variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204461.18006: variable 'network_connections' from source: play vars 32134 1727204461.18013: variable 'profile' from source: play vars 32134 1727204461.18066: variable 'profile' from source: play vars 32134 1727204461.18070: variable 'interface' from source: set_fact 32134 1727204461.18122: variable 'interface' from source: set_fact 32134 1727204461.18151: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204461.18155: when evaluation is False, skipping this task 32134 1727204461.18158: _execute() done 32134 1727204461.18161: dumping result to json 32134 1727204461.18163: done dumping result, returning 32134 1727204461.18173: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000008b] 32134 1727204461.18184: sending task result for task 12b410aa-8751-753f-5162-00000000008b 32134 1727204461.18281: done sending task result for task 12b410aa-8751-753f-5162-00000000008b 32134 1727204461.18283: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204461.18348: no more pending results, returning what we have 32134 1727204461.18352: results queue empty 32134 1727204461.18353: checking for any_errors_fatal 32134 1727204461.18361: done checking for any_errors_fatal 32134 1727204461.18362: checking for max_fail_percentage 32134 1727204461.18363: done checking for max_fail_percentage 32134 1727204461.18365: checking to see if all hosts have failed and the running result is not ok 32134 1727204461.18366: done checking to see if all hosts have failed 32134 1727204461.18367: getting the remaining hosts for this loop 32134 1727204461.18368: done getting the remaining hosts for this loop 32134 1727204461.18372: getting the next task for host managed-node2 32134 1727204461.18379: done getting next task for host managed-node2 32134 1727204461.18384: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32134 1727204461.18386: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204461.18408: getting variables 32134 1727204461.18410: in VariableManager get_vars() 32134 1727204461.18453: Calling all_inventory to load vars for managed-node2 32134 1727204461.18457: Calling groups_inventory to load vars for managed-node2 32134 1727204461.18459: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204461.18470: Calling all_plugins_play to load vars for managed-node2 32134 1727204461.18473: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204461.18476: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.19934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.21567: done with get_vars() 32134 1727204461.21600: done getting variables 32134 1727204461.21659: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.086) 0:00:35.620 ***** 32134 1727204461.21687: entering _queue_task() for managed-node2/package 32134 1727204461.21976: worker is 1 (out of 1 available) 32134 1727204461.21992: exiting _queue_task() for managed-node2/package 32134 1727204461.22007: done queuing things up, now waiting for results queue to drain 32134 1727204461.22009: waiting for pending results... 32134 1727204461.22208: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 32134 1727204461.22305: in run() - task 12b410aa-8751-753f-5162-00000000008c 32134 1727204461.22320: variable 'ansible_search_path' from source: unknown 32134 1727204461.22324: variable 'ansible_search_path' from source: unknown 32134 1727204461.22363: calling self._execute() 32134 1727204461.22450: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.22456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.22470: variable 'omit' from source: magic vars 32134 1727204461.22805: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.22817: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.22992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204461.23224: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204461.23268: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204461.23298: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204461.23365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204461.23464: variable 'network_packages' from source: role '' defaults 32134 1727204461.23556: variable '__network_provider_setup' from source: role '' defaults 32134 1727204461.23571: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204461.23630: variable 
'__network_service_name_default_nm' from source: role '' defaults 32134 1727204461.23639: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204461.23696: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204461.23854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204461.25434: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204461.25486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204461.25518: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204461.25550: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204461.25582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204461.25654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.25678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.25701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.25735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.25752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.25793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.25816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.25836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.25871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.25886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.26085: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32134 1727204461.26186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.26291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.26296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.26299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.26302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.26354: variable 'ansible_python' from source: facts 32134 1727204461.26377: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32134 1727204461.26449: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204461.26518: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204461.26626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.26649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.26669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.26701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.26716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.26759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.26783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.26805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.26838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.26852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.26972: variable 'network_connections' from source: play vars 32134 1727204461.26979: variable 'profile' from source: play vars 32134 1727204461.27062: variable 'profile' from source: play vars 32134 1727204461.27069: variable 'interface' from source: set_fact 32134 1727204461.27132: variable 'interface' from source: set_fact 32134 1727204461.27191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204461.27271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204461.27274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.27277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204461.27303: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204461.27534: variable 'network_connections' from source: play vars 32134 1727204461.27540: variable 'profile' from source: play vars 32134 1727204461.27629: variable 'profile' from source: play vars 32134 1727204461.27633: variable 'interface' from source: set_fact 32134 1727204461.27687: variable 'interface' from source: set_fact 32134 1727204461.27718: variable '__network_packages_default_wireless' from source: role '' defaults 32134 1727204461.27785: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204461.28044: variable 'network_connections' from source: play vars 32134 1727204461.28049: variable 'profile' from source: play vars 32134 1727204461.28107: variable 'profile' from source: play vars 32134 1727204461.28113: variable 'interface' from source: set_fact 32134 1727204461.28198: variable 'interface' from source: set_fact 32134 1727204461.28221: variable '__network_packages_default_team' from source: role '' defaults 32134 1727204461.28290: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204461.28543: variable 'network_connections' from source: play vars 32134 1727204461.28547: variable 'profile' from source: play vars 32134 1727204461.28609: variable 'profile' from source: play vars 32134 1727204461.28616: variable 'interface' from source: set_fact 32134 1727204461.28688: variable 'interface' from source: set_fact 32134 1727204461.28741: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204461.28793: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204461.28830: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204461.28870: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204461.29062: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32134 1727204461.29637: variable 'network_connections' from source: play vars 32134 1727204461.29641: variable 'profile' from source: play vars 32134 
1727204461.29694: variable 'profile' from source: play vars 32134 1727204461.29698: variable 'interface' from source: set_fact 32134 1727204461.29757: variable 'interface' from source: set_fact 32134 1727204461.29765: variable 'ansible_distribution' from source: facts 32134 1727204461.29769: variable '__network_rh_distros' from source: role '' defaults 32134 1727204461.29779: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.29791: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32134 1727204461.29935: variable 'ansible_distribution' from source: facts 32134 1727204461.29939: variable '__network_rh_distros' from source: role '' defaults 32134 1727204461.29945: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.29953: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32134 1727204461.30093: variable 'ansible_distribution' from source: facts 32134 1727204461.30097: variable '__network_rh_distros' from source: role '' defaults 32134 1727204461.30105: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.30138: variable 'network_provider' from source: set_fact 32134 1727204461.30154: variable 'ansible_facts' from source: unknown 32134 1727204461.30868: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32134 1727204461.30874: when evaluation is False, skipping this task 32134 1727204461.30877: _execute() done 32134 1727204461.30880: dumping result to json 32134 1727204461.30883: done dumping result, returning 32134 1727204461.30893: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-753f-5162-00000000008c] 32134 1727204461.30899: sending task result for task 12b410aa-8751-753f-5162-00000000008c 32134 1727204461.31004: done sending task result for task 12b410aa-8751-753f-5162-00000000008c 32134 1727204461.31007: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32134 1727204461.31064: no more pending results, returning what we have 32134 1727204461.31068: results queue empty 32134 1727204461.31069: checking for any_errors_fatal 32134 1727204461.31077: done checking for any_errors_fatal 32134 1727204461.31078: checking for max_fail_percentage 32134 1727204461.31080: done checking for max_fail_percentage 32134 1727204461.31081: checking to see if all hosts have failed and the running result is not ok 32134 1727204461.31082: done checking to see if all hosts have failed 32134 1727204461.31083: getting the remaining hosts for this loop 32134 1727204461.31084: done getting the remaining hosts for this loop 32134 1727204461.31088: getting the next task for host managed-node2 32134 1727204461.31098: done getting next task for host managed-node2 32134 1727204461.31103: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32134 1727204461.31105: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204461.31121: getting variables 32134 1727204461.31123: in VariableManager get_vars() 32134 1727204461.31165: Calling all_inventory to load vars for managed-node2 32134 1727204461.31168: Calling groups_inventory to load vars for managed-node2 32134 1727204461.31170: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204461.31187: Calling all_plugins_play to load vars for managed-node2 32134 1727204461.31199: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204461.31203: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.32760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.34946: done with get_vars() 32134 1727204461.34975: done getting variables 32134 1727204461.35034: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.133) 0:00:35.754 ***** 32134 1727204461.35059: entering _queue_task() for managed-node2/package 32134 1727204461.35334: worker is 1 (out of 1 available) 32134 1727204461.35347: exiting _queue_task() for managed-node2/package 32134 1727204461.35362: done queuing things up, now waiting for results queue to drain 32134 1727204461.35364: waiting for pending results... 
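The 'Install packages' trace above resolves network_packages from the role defaults and then skips because every listed package is already installed on the managed node. A minimal, hypothetical reproduction of that skip (the playbook wrapper, the package_facts step, and the example package list are assumptions; the when-expression is copied verbatim from the false_condition in the log):

# install_packages_sketch.yml -- not the role's source, just the same skip logic
- hosts: localhost
  gather_facts: false
  vars:
    network_packages: [NetworkManager]   # example value; the role computes this list
  tasks:
    - name: Gather installed-package facts used by the condition
      ansible.builtin.package_facts:

    - name: Install packages             # same shape as roles/network/tasks/main.yml:73
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())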
32134 1727204461.35572: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32134 1727204461.35663: in run() - task 12b410aa-8751-753f-5162-00000000008d 32134 1727204461.35676: variable 'ansible_search_path' from source: unknown 32134 1727204461.35680: variable 'ansible_search_path' from source: unknown 32134 1727204461.35719: calling self._execute() 32134 1727204461.35803: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.35810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.35825: variable 'omit' from source: magic vars 32134 1727204461.36543: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.36547: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.36669: variable 'network_state' from source: role '' defaults 32134 1727204461.36688: Evaluated conditional (network_state != {}): False 32134 1727204461.36700: when evaluation is False, skipping this task 32134 1727204461.36710: _execute() done 32134 1727204461.36721: dumping result to json 32134 1727204461.36731: done dumping result, returning 32134 1727204461.36744: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-753f-5162-00000000008d] 32134 1727204461.36758: sending task result for task 12b410aa-8751-753f-5162-00000000008d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204461.36950: no more pending results, returning what we have 32134 1727204461.36955: results queue empty 32134 1727204461.36956: checking for any_errors_fatal 32134 1727204461.36964: done checking for any_errors_fatal 32134 1727204461.36965: checking for max_fail_percentage 32134 1727204461.36967: done checking for max_fail_percentage 32134 1727204461.36968: checking to see if all hosts have failed and the running result is not ok 32134 1727204461.36969: done checking to see if all hosts have failed 32134 1727204461.36970: getting the remaining hosts for this loop 32134 1727204461.36971: done getting the remaining hosts for this loop 32134 1727204461.36975: getting the next task for host managed-node2 32134 1727204461.36981: done getting next task for host managed-node2 32134 1727204461.36986: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32134 1727204461.36988: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204461.37012: getting variables 32134 1727204461.37014: in VariableManager get_vars() 32134 1727204461.37054: Calling all_inventory to load vars for managed-node2 32134 1727204461.37058: Calling groups_inventory to load vars for managed-node2 32134 1727204461.37060: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204461.37074: Calling all_plugins_play to load vars for managed-node2 32134 1727204461.37078: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204461.37082: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.37720: done sending task result for task 12b410aa-8751-753f-5162-00000000008d 32134 1727204461.37723: WORKER PROCESS EXITING 32134 1727204461.39531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.42672: done with get_vars() 32134 1727204461.42723: done getting variables 32134 1727204461.42801: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.077) 0:00:35.832 ***** 32134 1727204461.42841: entering _queue_task() for managed-node2/package 32134 1727204461.43233: worker is 1 (out of 1 available) 32134 1727204461.43250: exiting _queue_task() for managed-node2/package 32134 1727204461.43265: done queuing things up, now waiting for results queue to drain 32134 1727204461.43267: waiting for pending results... 
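Both network_state-gated install tasks (tasks/main.yml:85 above and tasks/main.yml:96 queued here) skip for the same reason: network_state is an empty dict by default. A hypothetical sketch of that pattern (package names follow the task titles in the log; the wrapper and the explicit default are assumptions):

# network_state_installs_sketch.yml -- illustrates the "network_state != {}" gate
- hosts: localhost
  gather_facts: false
  vars:
    network_state: {}        # role default; both tasks below therefore skip
  tasks:
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name: [NetworkManager, nmstate]
        state: present
      when: network_state != {}

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}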
32134 1727204461.43714: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32134 1727204461.43770: in run() - task 12b410aa-8751-753f-5162-00000000008e 32134 1727204461.43795: variable 'ansible_search_path' from source: unknown 32134 1727204461.43806: variable 'ansible_search_path' from source: unknown 32134 1727204461.43859: calling self._execute() 32134 1727204461.43987: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.44006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.44028: variable 'omit' from source: magic vars 32134 1727204461.44523: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.44544: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.44715: variable 'network_state' from source: role '' defaults 32134 1727204461.44895: Evaluated conditional (network_state != {}): False 32134 1727204461.44898: when evaluation is False, skipping this task 32134 1727204461.44901: _execute() done 32134 1727204461.44903: dumping result to json 32134 1727204461.44906: done dumping result, returning 32134 1727204461.44909: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-753f-5162-00000000008e] 32134 1727204461.44914: sending task result for task 12b410aa-8751-753f-5162-00000000008e 32134 1727204461.44998: done sending task result for task 12b410aa-8751-753f-5162-00000000008e 32134 1727204461.45002: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204461.45060: no more pending results, returning what we have 32134 1727204461.45066: results queue empty 32134 1727204461.45067: checking for any_errors_fatal 32134 1727204461.45077: done checking for any_errors_fatal 32134 1727204461.45078: checking for max_fail_percentage 32134 1727204461.45080: done checking for max_fail_percentage 32134 1727204461.45081: checking to see if all hosts have failed and the running result is not ok 32134 1727204461.45083: done checking to see if all hosts have failed 32134 1727204461.45084: getting the remaining hosts for this loop 32134 1727204461.45085: done getting the remaining hosts for this loop 32134 1727204461.45092: getting the next task for host managed-node2 32134 1727204461.45100: done getting next task for host managed-node2 32134 1727204461.45106: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32134 1727204461.45109: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204461.45132: getting variables 32134 1727204461.45134: in VariableManager get_vars() 32134 1727204461.45182: Calling all_inventory to load vars for managed-node2 32134 1727204461.45186: Calling groups_inventory to load vars for managed-node2 32134 1727204461.45392: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204461.45406: Calling all_plugins_play to load vars for managed-node2 32134 1727204461.45413: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204461.45418: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.47913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.50925: done with get_vars() 32134 1727204461.50980: done getting variables 32134 1727204461.51051: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.082) 0:00:35.914 ***** 32134 1727204461.51085: entering _queue_task() for managed-node2/service 32134 1727204461.51471: worker is 1 (out of 1 available) 32134 1727204461.51486: exiting _queue_task() for managed-node2/service 32134 1727204461.51701: done queuing things up, now waiting for results queue to drain 32134 1727204461.51704: waiting for pending results... 
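The task queued above (tasks/main.yml:109) goes through the 'service' action plugin and, as the trace that follows shows, skips on the same wireless/team condition as the consent gate. A hypothetical sketch of a task with that shape (the service name, become, and the variable values are assumptions, not the role's source):

# restart_nm_sketch.yml -- same skip condition as traced below
- hosts: localhost
  gather_facts: false
  become: true
  vars:
    __network_wireless_connections_defined: false
    __network_team_connections_defined: false
  tasks:
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined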
32134 1727204461.51909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32134 1727204461.51963: in run() - task 12b410aa-8751-753f-5162-00000000008f 32134 1727204461.51985: variable 'ansible_search_path' from source: unknown 32134 1727204461.51997: variable 'ansible_search_path' from source: unknown 32134 1727204461.52047: calling self._execute() 32134 1727204461.52171: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.52185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.52255: variable 'omit' from source: magic vars 32134 1727204461.52683: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.52713: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.52877: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204461.53164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204461.55944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204461.56065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204461.56100: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204461.56149: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204461.56395: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204461.56399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.56402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.56405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.56425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.56447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.56517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.56554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.56593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32134 1727204461.56649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.56668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.56722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.56794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.56797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.56837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.56860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.57120: variable 'network_connections' from source: play vars 32134 1727204461.57143: variable 'profile' from source: play vars 32134 1727204461.57256: variable 'profile' from source: play vars 32134 1727204461.57286: variable 'interface' from source: set_fact 32134 1727204461.57360: variable 'interface' from source: set_fact 32134 1727204461.57507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204461.57744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204461.57801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204461.57946: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204461.57949: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204461.57958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204461.57994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204461.58035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.58078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204461.58147: variable '__network_team_connections_defined' from source: role '' defaults 32134 
1727204461.58506: variable 'network_connections' from source: play vars 32134 1727204461.58522: variable 'profile' from source: play vars 32134 1727204461.58606: variable 'profile' from source: play vars 32134 1727204461.58620: variable 'interface' from source: set_fact 32134 1727204461.58703: variable 'interface' from source: set_fact 32134 1727204461.58746: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32134 1727204461.58756: when evaluation is False, skipping this task 32134 1727204461.58765: _execute() done 32134 1727204461.58774: dumping result to json 32134 1727204461.58782: done dumping result, returning 32134 1727204461.58798: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-753f-5162-00000000008f] 32134 1727204461.58928: sending task result for task 12b410aa-8751-753f-5162-00000000008f 32134 1727204461.59007: done sending task result for task 12b410aa-8751-753f-5162-00000000008f 32134 1727204461.59013: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32134 1727204461.59081: no more pending results, returning what we have 32134 1727204461.59085: results queue empty 32134 1727204461.59086: checking for any_errors_fatal 32134 1727204461.59095: done checking for any_errors_fatal 32134 1727204461.59096: checking for max_fail_percentage 32134 1727204461.59098: done checking for max_fail_percentage 32134 1727204461.59099: checking to see if all hosts have failed and the running result is not ok 32134 1727204461.59100: done checking to see if all hosts have failed 32134 1727204461.59101: getting the remaining hosts for this loop 32134 1727204461.59102: done getting the remaining hosts for this loop 32134 1727204461.59107: getting the next task for host managed-node2 32134 1727204461.59116: done getting next task for host managed-node2 32134 1727204461.59120: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32134 1727204461.59123: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204461.59140: getting variables 32134 1727204461.59142: in VariableManager get_vars() 32134 1727204461.59187: Calling all_inventory to load vars for managed-node2 32134 1727204461.59296: Calling groups_inventory to load vars for managed-node2 32134 1727204461.59300: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204461.59315: Calling all_plugins_play to load vars for managed-node2 32134 1727204461.59319: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204461.59323: Calling groups_plugins_play to load vars for managed-node2 32134 1727204461.61917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204461.64997: done with get_vars() 32134 1727204461.65035: done getting variables 32134 1727204461.65102: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.140) 0:00:36.055 ***** 32134 1727204461.65135: entering _queue_task() for managed-node2/service 32134 1727204461.65725: worker is 1 (out of 1 available) 32134 1727204461.65738: exiting _queue_task() for managed-node2/service 32134 1727204461.65750: done queuing things up, now waiting for results queue to drain 32134 1727204461.65752: waiting for pending results... 32134 1727204461.65914: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32134 1727204461.66091: in run() - task 12b410aa-8751-753f-5162-000000000090 32134 1727204461.66097: variable 'ansible_search_path' from source: unknown 32134 1727204461.66100: variable 'ansible_search_path' from source: unknown 32134 1727204461.66197: calling self._execute() 32134 1727204461.66274: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.66291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.66318: variable 'omit' from source: magic vars 32134 1727204461.67159: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.67164: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204461.67167: variable 'network_provider' from source: set_fact 32134 1727204461.67169: variable 'network_state' from source: role '' defaults 32134 1727204461.67171: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32134 1727204461.67174: variable 'omit' from source: magic vars 32134 1727204461.67395: variable 'omit' from source: magic vars 32134 1727204461.67399: variable 'network_service_name' from source: role '' defaults 32134 1727204461.67402: variable 'network_service_name' from source: role '' defaults 32134 1727204461.67516: variable '__network_provider_setup' from source: role '' defaults 32134 1727204461.67531: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204461.67632: variable '__network_service_name_default_nm' from source: role '' defaults 32134 1727204461.67635: variable '__network_packages_default_nm' from source: role '' defaults 
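Unlike the earlier tasks, 'Enable and start NetworkManager' (tasks/main.yml:122) has its conditional evaluate True, and the trace continues below by resolving network_service_name through __network_service_name_default_nm. A hypothetical sketch of what that variable chain amounts to (the concrete value NetworkManager, become, and the wrapper are assumptions; the variable names and the when-expression are taken from the log):

# enable_start_nm_sketch.yml -- not the role's source; mirrors the variable chain in the trace
- hosts: localhost
  gather_facts: false
  become: true
  vars:
    network_provider: nm
    network_state: {}
    __network_service_name_default_nm: NetworkManager   # assumed concrete value
    network_service_name: "{{ __network_service_name_default_nm }}"
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}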
32134 1727204461.67710: variable '__network_packages_default_nm' from source: role '' defaults 32134 1727204461.68066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204461.70748: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204461.70852: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204461.70902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204461.71007: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204461.71010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204461.71087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.71139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.71175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.71240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.71261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.71329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.71367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.71446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.71463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.71484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.71822: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32134 1727204461.71992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.72032: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.72067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.72128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.72194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.72274: variable 'ansible_python' from source: facts 32134 1727204461.72307: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32134 1727204461.72428: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204461.72538: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204461.72714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.72756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.72863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.72867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.72869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.72937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204461.72984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204461.73026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.73076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204461.73101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204461.73297: variable 'network_connections' from 
source: play vars 32134 1727204461.73300: variable 'profile' from source: play vars 32134 1727204461.73385: variable 'profile' from source: play vars 32134 1727204461.73406: variable 'interface' from source: set_fact 32134 1727204461.73488: variable 'interface' from source: set_fact 32134 1727204461.73650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204461.73930: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204461.74196: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204461.74199: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204461.74202: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204461.74209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204461.74255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204461.74303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204461.74359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204461.74432: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204461.74860: variable 'network_connections' from source: play vars 32134 1727204461.74878: variable 'profile' from source: play vars 32134 1727204461.74995: variable 'profile' from source: play vars 32134 1727204461.74999: variable 'interface' from source: set_fact 32134 1727204461.75086: variable 'interface' from source: set_fact 32134 1727204461.75197: variable '__network_packages_default_wireless' from source: role '' defaults 32134 1727204461.75247: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204461.75670: variable 'network_connections' from source: play vars 32134 1727204461.75682: variable 'profile' from source: play vars 32134 1727204461.75781: variable 'profile' from source: play vars 32134 1727204461.75796: variable 'interface' from source: set_fact 32134 1727204461.75899: variable 'interface' from source: set_fact 32134 1727204461.75940: variable '__network_packages_default_team' from source: role '' defaults 32134 1727204461.76053: variable '__network_team_connections_defined' from source: role '' defaults 32134 1727204461.76471: variable 'network_connections' from source: play vars 32134 1727204461.76482: variable 'profile' from source: play vars 32134 1727204461.76575: variable 'profile' from source: play vars 32134 1727204461.76597: variable 'interface' from source: set_fact 32134 1727204461.76685: variable 'interface' from source: set_fact 32134 1727204461.76834: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204461.76854: variable '__network_service_name_default_initscripts' from source: role '' defaults 32134 1727204461.76867: 
variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204461.76954: variable '__network_packages_default_initscripts' from source: role '' defaults 32134 1727204461.77264: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32134 1727204461.77998: variable 'network_connections' from source: play vars 32134 1727204461.78010: variable 'profile' from source: play vars 32134 1727204461.78099: variable 'profile' from source: play vars 32134 1727204461.78110: variable 'interface' from source: set_fact 32134 1727204461.78205: variable 'interface' from source: set_fact 32134 1727204461.78253: variable 'ansible_distribution' from source: facts 32134 1727204461.78257: variable '__network_rh_distros' from source: role '' defaults 32134 1727204461.78259: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.78267: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32134 1727204461.78518: variable 'ansible_distribution' from source: facts 32134 1727204461.78529: variable '__network_rh_distros' from source: role '' defaults 32134 1727204461.78580: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.78583: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32134 1727204461.78798: variable 'ansible_distribution' from source: facts 32134 1727204461.78810: variable '__network_rh_distros' from source: role '' defaults 32134 1727204461.78825: variable 'ansible_distribution_major_version' from source: facts 32134 1727204461.78874: variable 'network_provider' from source: set_fact 32134 1727204461.78918: variable 'omit' from source: magic vars 32134 1727204461.78994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204461.78998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204461.79030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204461.79057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204461.79075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204461.79126: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204461.79194: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.79198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.79285: Set connection var ansible_timeout to 10 32134 1727204461.79314: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204461.79324: Set connection var ansible_connection to ssh 32134 1727204461.79332: Set connection var ansible_shell_type to sh 32134 1727204461.79349: Set connection var ansible_shell_executable to /bin/sh 32134 1727204461.79361: Set connection var ansible_pipelining to False 32134 1727204461.79398: variable 'ansible_shell_executable' from source: unknown 32134 1727204461.79407: variable 'ansible_connection' from source: unknown 32134 1727204461.79420: variable 'ansible_module_compression' from source: unknown 32134 1727204461.79428: variable 'ansible_shell_type' from source: unknown 32134 1727204461.79451: variable 'ansible_shell_executable' from 
source: unknown 32134 1727204461.79454: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204461.79461: variable 'ansible_pipelining' from source: unknown 32134 1727204461.79595: variable 'ansible_timeout' from source: unknown 32134 1727204461.79598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204461.79619: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204461.79638: variable 'omit' from source: magic vars 32134 1727204461.79648: starting attempt loop 32134 1727204461.79656: running the handler 32134 1727204461.79765: variable 'ansible_facts' from source: unknown 32134 1727204461.81071: _low_level_execute_command(): starting 32134 1727204461.81085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204461.81861: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204461.81879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204461.81913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204461.82008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204461.82046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204461.82064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204461.82091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204461.82165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204461.84030: stdout chunk (state=3): >>>/root <<< 32134 1727204461.84226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204461.84230: stdout chunk (state=3): >>><<< 32134 1727204461.84233: stderr chunk (state=3): >>><<< 32134 1727204461.84263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204461.84284: _low_level_execute_command(): starting 32134 1727204461.84299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183 `" && echo ansible-tmp-1727204461.8427055-33783-145959074664183="` echo /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183 `" ) && sleep 0' 32134 1727204461.84989: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204461.85117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204461.85144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204461.85225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204461.87392: stdout chunk (state=3): >>>ansible-tmp-1727204461.8427055-33783-145959074664183=/root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183 <<< 32134 1727204461.87615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204461.87618: stdout chunk (state=3): >>><<< 32134 1727204461.87621: stderr chunk (state=3): >>><<< 32134 1727204461.87640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204461.8427055-33783-145959074664183=/root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204461.87725: variable 'ansible_module_compression' from source: unknown 32134 1727204461.87797: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 32134 1727204461.88031: variable 'ansible_facts' from source: unknown 32134 1727204461.88143: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py 32134 1727204461.88346: Sending initial data 32134 1727204461.88364: Sent initial data (156 bytes) 32134 1727204461.89079: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204461.89103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204461.89162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204461.89176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204461.89227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204461.90987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204461.91037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204461.91096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpkjm6_mvv /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py <<< 32134 1727204461.91100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py" <<< 32134 1727204461.91155: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpkjm6_mvv" to remote "/root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py" <<< 32134 1727204461.93044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204461.93158: stderr chunk (state=3): >>><<< 32134 1727204461.93162: stdout chunk (state=3): >>><<< 32134 1727204461.93165: done transferring module to remote 32134 1727204461.93167: _low_level_execute_command(): starting 32134 1727204461.93169: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/ /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py && sleep 0' 32134 1727204461.93652: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204461.93656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204461.93659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204461.93662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204461.93709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204461.93717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204461.93763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204461.95741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204461.95795: stderr chunk (state=3): >>><<< 32134 1727204461.95799: stdout chunk (state=3): >>><<< 32134 1727204461.95808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204461.95812: _low_level_execute_command(): starting 32134 1727204461.95821: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/AnsiballZ_systemd.py && sleep 0' 32134 1727204461.96262: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204461.96266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204461.96302: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204461.96306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204461.96364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204461.96368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204461.96429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204462.30685: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": 
"stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4468736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1602330000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 32134 1727204462.30699: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": 
"\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 32134 1727204462.30713: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32134 1727204462.32896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204462.32961: stderr chunk (state=3): >>><<< 32134 1727204462.32965: stdout chunk (state=3): >>><<< 32134 1727204462.32985: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4468736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1602330000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204462.33156: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204462.33177: _low_level_execute_command(): starting 32134 1727204462.33183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204461.8427055-33783-145959074664183/ > /dev/null 2>&1 && sleep 0' 32134 1727204462.33674: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204462.33679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204462.33682: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.33684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204462.33687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.33743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204462.33746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204462.33795: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32134 1727204462.35835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204462.35898: stderr chunk (state=3): >>><<< 32134 1727204462.35902: stdout chunk (state=3): >>><<< 32134 1727204462.35917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204462.35927: handler run complete 32134 1727204462.35982: attempt loop complete, returning result 32134 1727204462.35985: _execute() done 32134 1727204462.35988: dumping result to json 32134 1727204462.36008: done dumping result, returning 32134 1727204462.36018: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-753f-5162-000000000090] 32134 1727204462.36023: sending task result for task 12b410aa-8751-753f-5162-000000000090 32134 1727204462.36314: done sending task result for task 12b410aa-8751-753f-5162-000000000090 32134 1727204462.36317: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204462.36423: no more pending results, returning what we have 32134 1727204462.36427: results queue empty 32134 1727204462.36431: checking for any_errors_fatal 32134 1727204462.36439: done checking for any_errors_fatal 32134 1727204462.36440: checking for max_fail_percentage 32134 1727204462.36442: done checking for max_fail_percentage 32134 1727204462.36443: checking to see if all hosts have failed and the running result is not ok 32134 1727204462.36444: done checking to see if all hosts have failed 32134 1727204462.36445: getting the remaining hosts for this loop 32134 1727204462.36446: done getting the remaining hosts for this loop 32134 1727204462.36455: getting the next task for host managed-node2 32134 1727204462.36513: done getting next task for host managed-node2 32134 1727204462.36521: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32134 1727204462.36524: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204462.36564: getting variables 32134 1727204462.36571: in VariableManager get_vars() 32134 1727204462.36609: Calling all_inventory to load vars for managed-node2 32134 1727204462.36613: Calling groups_inventory to load vars for managed-node2 32134 1727204462.36615: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204462.36624: Calling all_plugins_play to load vars for managed-node2 32134 1727204462.36628: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204462.36632: Calling groups_plugins_play to load vars for managed-node2 32134 1727204462.37903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204462.40345: done with get_vars() 32134 1727204462.40369: done getting variables 32134 1727204462.40428: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.753) 0:00:36.808 ***** 32134 1727204462.40455: entering _queue_task() for managed-node2/service 32134 1727204462.40740: worker is 1 (out of 1 available) 32134 1727204462.40756: exiting _queue_task() for managed-node2/service 32134 1727204462.40771: done queuing things up, now waiting for results queue to drain 32134 1727204462.40773: waiting for pending results... 32134 1727204462.40970: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32134 1727204462.41064: in run() - task 12b410aa-8751-753f-5162-000000000091 32134 1727204462.41078: variable 'ansible_search_path' from source: unknown 32134 1727204462.41081: variable 'ansible_search_path' from source: unknown 32134 1727204462.41121: calling self._execute() 32134 1727204462.41205: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204462.41215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204462.41229: variable 'omit' from source: magic vars 32134 1727204462.41732: variable 'ansible_distribution_major_version' from source: facts 32134 1727204462.41737: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204462.42275: variable 'network_provider' from source: set_fact 32134 1727204462.42279: Evaluated conditional (network_provider == "nm"): True 32134 1727204462.42282: variable '__network_wpa_supplicant_required' from source: role '' defaults 32134 1727204462.42394: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32134 1727204462.42618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204462.45157: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204462.45250: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204462.45299: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204462.45344: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204462.45379: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204462.45491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204462.45544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204462.45582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204462.45639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204462.45660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204462.45725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204462.45759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204462.45794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204462.45894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204462.45898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204462.45924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32134 1727204462.45958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204462.45995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204462.46047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204462.46067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 32134 1727204462.46257: variable 'network_connections' from source: play vars 32134 1727204462.46279: variable 'profile' from source: play vars 32134 1727204462.46380: variable 'profile' from source: play vars 32134 1727204462.46494: variable 'interface' from source: set_fact 32134 1727204462.46497: variable 'interface' from source: set_fact 32134 1727204462.46567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32134 1727204462.46777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32134 1727204462.46831: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32134 1727204462.46872: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32134 1727204462.46912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32134 1727204462.46967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32134 1727204462.46999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32134 1727204462.47036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204462.47072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32134 1727204462.47133: variable '__network_wireless_connections_defined' from source: role '' defaults 32134 1727204462.47480: variable 'network_connections' from source: play vars 32134 1727204462.47496: variable 'profile' from source: play vars 32134 1727204462.47578: variable 'profile' from source: play vars 32134 1727204462.47591: variable 'interface' from source: set_fact 32134 1727204462.47669: variable 'interface' from source: set_fact 32134 1727204462.47712: Evaluated conditional (__network_wpa_supplicant_required): False 32134 1727204462.47895: when evaluation is False, skipping this task 32134 1727204462.47899: _execute() done 32134 1727204462.47912: dumping result to json 32134 1727204462.47914: done dumping result, returning 32134 1727204462.47917: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-753f-5162-000000000091] 32134 1727204462.47920: sending task result for task 12b410aa-8751-753f-5162-000000000091 32134 1727204462.47996: done sending task result for task 12b410aa-8751-753f-5162-000000000091 32134 1727204462.47999: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32134 1727204462.48065: no more pending results, returning what we have 32134 1727204462.48069: results queue empty 32134 1727204462.48070: checking for any_errors_fatal 32134 1727204462.48103: done checking for any_errors_fatal 32134 1727204462.48105: checking for max_fail_percentage 32134 1727204462.48107: done checking for max_fail_percentage 32134 
1727204462.48108: checking to see if all hosts have failed and the running result is not ok 32134 1727204462.48110: done checking to see if all hosts have failed 32134 1727204462.48113: getting the remaining hosts for this loop 32134 1727204462.48114: done getting the remaining hosts for this loop 32134 1727204462.48119: getting the next task for host managed-node2 32134 1727204462.48125: done getting next task for host managed-node2 32134 1727204462.48129: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32134 1727204462.48132: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204462.48146: getting variables 32134 1727204462.48148: in VariableManager get_vars() 32134 1727204462.48186: Calling all_inventory to load vars for managed-node2 32134 1727204462.48263: Calling groups_inventory to load vars for managed-node2 32134 1727204462.48268: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204462.48279: Calling all_plugins_play to load vars for managed-node2 32134 1727204462.48283: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204462.48287: Calling groups_plugins_play to load vars for managed-node2 32134 1727204462.50673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204462.53713: done with get_vars() 32134 1727204462.53763: done getting variables 32134 1727204462.53844: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.134) 0:00:36.942 ***** 32134 1727204462.53881: entering _queue_task() for managed-node2/service 32134 1727204462.54262: worker is 1 (out of 1 available) 32134 1727204462.54277: exiting _queue_task() for managed-node2/service 32134 1727204462.54494: done queuing things up, now waiting for results queue to drain 32134 1727204462.54496: waiting for pending results... 
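Note on the entries above: 'Enable and start wpa_supplicant' was skipped because __network_wpa_supplicant_required evaluated to False, after which the role queues 'Enable network service' (a 'service' action at main.yml:142). A minimal sketch of such a gated task follows; the action plugin and the when-condition are taken from the log, while the module arguments are assumptions for illustration only:

    - name: Enable network service
      ansible.builtin.service:
        name: network        # assumed service name, not shown in this log
        enabled: true        # assumed arguments, for illustration only
      when: network_provider == "initscripts"
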
32134 1727204462.54600: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 32134 1727204462.54732: in run() - task 12b410aa-8751-753f-5162-000000000092 32134 1727204462.54755: variable 'ansible_search_path' from source: unknown 32134 1727204462.54763: variable 'ansible_search_path' from source: unknown 32134 1727204462.54809: calling self._execute() 32134 1727204462.54924: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204462.54943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204462.54959: variable 'omit' from source: magic vars 32134 1727204462.55413: variable 'ansible_distribution_major_version' from source: facts 32134 1727204462.55480: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204462.55594: variable 'network_provider' from source: set_fact 32134 1727204462.55607: Evaluated conditional (network_provider == "initscripts"): False 32134 1727204462.55615: when evaluation is False, skipping this task 32134 1727204462.55623: _execute() done 32134 1727204462.55632: dumping result to json 32134 1727204462.55640: done dumping result, returning 32134 1727204462.55651: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-753f-5162-000000000092] 32134 1727204462.55663: sending task result for task 12b410aa-8751-753f-5162-000000000092 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32134 1727204462.55943: no more pending results, returning what we have 32134 1727204462.55948: results queue empty 32134 1727204462.55949: checking for any_errors_fatal 32134 1727204462.55962: done checking for any_errors_fatal 32134 1727204462.55963: checking for max_fail_percentage 32134 1727204462.55964: done checking for max_fail_percentage 32134 1727204462.55966: checking to see if all hosts have failed and the running result is not ok 32134 1727204462.55967: done checking to see if all hosts have failed 32134 1727204462.55968: getting the remaining hosts for this loop 32134 1727204462.55969: done getting the remaining hosts for this loop 32134 1727204462.55974: getting the next task for host managed-node2 32134 1727204462.55982: done getting next task for host managed-node2 32134 1727204462.55986: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32134 1727204462.55991: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204462.56010: getting variables 32134 1727204462.56012: in VariableManager get_vars() 32134 1727204462.56055: Calling all_inventory to load vars for managed-node2 32134 1727204462.56059: Calling groups_inventory to load vars for managed-node2 32134 1727204462.56062: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204462.56077: Calling all_plugins_play to load vars for managed-node2 32134 1727204462.56081: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204462.56085: Calling groups_plugins_play to load vars for managed-node2 32134 1727204462.57008: done sending task result for task 12b410aa-8751-753f-5162-000000000092 32134 1727204462.57012: WORKER PROCESS EXITING 32134 1727204462.58914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204462.62009: done with get_vars() 32134 1727204462.62057: done getting variables 32134 1727204462.62133: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.082) 0:00:37.025 ***** 32134 1727204462.62170: entering _queue_task() for managed-node2/copy 32134 1727204462.62548: worker is 1 (out of 1 available) 32134 1727204462.62563: exiting _queue_task() for managed-node2/copy 32134 1727204462.62576: done queuing things up, now waiting for results queue to drain 32134 1727204462.62577: waiting for pending results... 
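The 'Ensure initscripts network file dependency is present' task queued here (a 'copy' action at main.yml:150) is gated on the same initscripts check as 'Enable network service' above, and the log resolves network_provider from set_fact. On this run the provider is nm, which is why every initscripts-only task is skipped. A sketch of the kind of fact that would drive these conditionals, assuming a plain static assignment rather than the role's actual detection logic:

    - name: Set network provider for this run
      ansible.builtin.set_fact:
        network_provider: nm   # value implied by the skipped initscripts tasks and by
                               # the 'provider: nm' module arguments later in this log
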
32134 1727204462.62888: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32134 1727204462.63019: in run() - task 12b410aa-8751-753f-5162-000000000093 32134 1727204462.63041: variable 'ansible_search_path' from source: unknown 32134 1727204462.63050: variable 'ansible_search_path' from source: unknown 32134 1727204462.63094: calling self._execute() 32134 1727204462.63214: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204462.63233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204462.63252: variable 'omit' from source: magic vars 32134 1727204462.63704: variable 'ansible_distribution_major_version' from source: facts 32134 1727204462.63724: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204462.63880: variable 'network_provider' from source: set_fact 32134 1727204462.63894: Evaluated conditional (network_provider == "initscripts"): False 32134 1727204462.63903: when evaluation is False, skipping this task 32134 1727204462.63910: _execute() done 32134 1727204462.63919: dumping result to json 32134 1727204462.63930: done dumping result, returning 32134 1727204462.63945: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-753f-5162-000000000093] 32134 1727204462.63957: sending task result for task 12b410aa-8751-753f-5162-000000000093 32134 1727204462.64236: done sending task result for task 12b410aa-8751-753f-5162-000000000093 32134 1727204462.64239: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32134 1727204462.64292: no more pending results, returning what we have 32134 1727204462.64297: results queue empty 32134 1727204462.64298: checking for any_errors_fatal 32134 1727204462.64306: done checking for any_errors_fatal 32134 1727204462.64307: checking for max_fail_percentage 32134 1727204462.64309: done checking for max_fail_percentage 32134 1727204462.64310: checking to see if all hosts have failed and the running result is not ok 32134 1727204462.64311: done checking to see if all hosts have failed 32134 1727204462.64312: getting the remaining hosts for this loop 32134 1727204462.64314: done getting the remaining hosts for this loop 32134 1727204462.64318: getting the next task for host managed-node2 32134 1727204462.64325: done getting next task for host managed-node2 32134 1727204462.64330: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32134 1727204462.64332: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204462.64351: getting variables 32134 1727204462.64353: in VariableManager get_vars() 32134 1727204462.64397: Calling all_inventory to load vars for managed-node2 32134 1727204462.64401: Calling groups_inventory to load vars for managed-node2 32134 1727204462.64404: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204462.64417: Calling all_plugins_play to load vars for managed-node2 32134 1727204462.64421: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204462.64425: Calling groups_plugins_play to load vars for managed-node2 32134 1727204462.66820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204462.69941: done with get_vars() 32134 1727204462.69980: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.079) 0:00:37.104 ***** 32134 1727204462.70080: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 32134 1727204462.70452: worker is 1 (out of 1 available) 32134 1727204462.70468: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 32134 1727204462.70483: done queuing things up, now waiting for results queue to drain 32134 1727204462.70485: waiting for pending results... 32134 1727204462.70788: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32134 1727204462.71026: in run() - task 12b410aa-8751-753f-5162-000000000094 32134 1727204462.71030: variable 'ansible_search_path' from source: unknown 32134 1727204462.71033: variable 'ansible_search_path' from source: unknown 32134 1727204462.71036: calling self._execute() 32134 1727204462.71108: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204462.71124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204462.71149: variable 'omit' from source: magic vars 32134 1727204462.71612: variable 'ansible_distribution_major_version' from source: facts 32134 1727204462.71632: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204462.71644: variable 'omit' from source: magic vars 32134 1727204462.71787: variable 'omit' from source: magic vars 32134 1727204462.71910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32134 1727204462.74553: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32134 1727204462.74651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32134 1727204462.74703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32134 1727204462.74755: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32134 1727204462.74792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32134 1727204462.74902: variable 'network_provider' from source: set_fact 32134 1727204462.75076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 32134 1727204462.75136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32134 1727204462.75181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32134 1727204462.75234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32134 1727204462.75254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32134 1727204462.75376: variable 'omit' from source: magic vars 32134 1727204462.75491: variable 'omit' from source: magic vars 32134 1727204462.75639: variable 'network_connections' from source: play vars 32134 1727204462.75661: variable 'profile' from source: play vars 32134 1727204462.75758: variable 'profile' from source: play vars 32134 1727204462.75770: variable 'interface' from source: set_fact 32134 1727204462.76094: variable 'interface' from source: set_fact 32134 1727204462.76097: variable 'omit' from source: magic vars 32134 1727204462.76100: variable '__lsr_ansible_managed' from source: task vars 32134 1727204462.76125: variable '__lsr_ansible_managed' from source: task vars 32134 1727204462.76508: Loaded config def from plugin (lookup/template) 32134 1727204462.76521: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32134 1727204462.76566: File lookup term: get_ansible_managed.j2 32134 1727204462.76575: variable 'ansible_search_path' from source: unknown 32134 1727204462.76586: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32134 1727204462.76611: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32134 1727204462.76637: variable 'ansible_search_path' from source: unknown 32134 1727204462.86886: variable 'ansible_managed' from source: unknown 32134 1727204462.87149: variable 'omit' from source: magic vars 32134 1727204462.87192: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204462.87236: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204462.87261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204462.87286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204462.87304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204462.87345: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204462.87355: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204462.87364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204462.87491: Set connection var ansible_timeout to 10 32134 1727204462.87516: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204462.87524: Set connection var ansible_connection to ssh 32134 1727204462.87531: Set connection var ansible_shell_type to sh 32134 1727204462.87547: Set connection var ansible_shell_executable to /bin/sh 32134 1727204462.87558: Set connection var ansible_pipelining to False 32134 1727204462.87588: variable 'ansible_shell_executable' from source: unknown 32134 1727204462.87598: variable 'ansible_connection' from source: unknown 32134 1727204462.87605: variable 'ansible_module_compression' from source: unknown 32134 1727204462.87612: variable 'ansible_shell_type' from source: unknown 32134 1727204462.87621: variable 'ansible_shell_executable' from source: unknown 32134 1727204462.87629: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204462.87637: variable 'ansible_pipelining' from source: unknown 32134 1727204462.87648: variable 'ansible_timeout' from source: unknown 32134 1727204462.87657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204462.87824: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204462.87850: variable 'omit' from source: magic vars 32134 1727204462.87867: starting attempt loop 32134 1727204462.87874: running the handler 32134 1727204462.87895: _low_level_execute_command(): starting 32134 1727204462.87907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204462.88709: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.88760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204462.88792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204462.88808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204462.88879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204462.90718: stdout chunk (state=3): >>>/root <<< 32134 1727204462.90825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204462.90930: stderr chunk (state=3): >>><<< 32134 1727204462.90934: stdout chunk (state=3): >>><<< 32134 1727204462.91035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204462.91038: _low_level_execute_command(): starting 32134 1727204462.91042: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061 `" && echo ansible-tmp-1727204462.9097037-33812-136270711718061="` echo /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061 `" ) && sleep 0' 32134 1727204462.91710: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204462.91747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.91801: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204462.91808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204462.91859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204462.94013: stdout chunk (state=3): >>>ansible-tmp-1727204462.9097037-33812-136270711718061=/root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061 <<< 32134 1727204462.94233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204462.94236: stdout chunk (state=3): >>><<< 32134 1727204462.94239: stderr chunk (state=3): >>><<< 32134 1727204462.94273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204462.9097037-33812-136270711718061=/root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204462.94363: variable 'ansible_module_compression' from source: unknown 32134 1727204462.94408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 32134 1727204462.94437: variable 'ansible_facts' from source: unknown 32134 1727204462.94515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py 32134 1727204462.94630: Sending initial data 32134 1727204462.94633: Sent initial data (168 bytes) 32134 1727204462.95051: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204462.95084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204462.95087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204462.95093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.95096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
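The connection settings recorded a few entries above (Set connection var ansible_timeout to 10, ansible_pipelining to False, and so on) describe how this module run reaches managed-node2. Expressed as host variables in YAML they would look roughly like the block below; the values are read from the log, but whether they come from defaults, ansible.cfg, or the inventory is not visible in this excerpt:

    managed-node2:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_module_compression: ZIP_DEFLATED
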
32134 1727204462.95102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.95159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204462.95162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204462.95204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204462.96958: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204462.96993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204462.97062: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpfop70yb3 /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py <<< 32134 1727204462.97066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py" <<< 32134 1727204462.97106: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpfop70yb3" to remote "/root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py" <<< 32134 1727204462.98370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204462.98453: stderr chunk (state=3): >>><<< 32134 1727204462.98493: stdout chunk (state=3): >>><<< 32134 1727204462.98499: done transferring module to remote 32134 1727204462.98501: _low_level_execute_command(): starting 32134 1727204462.98504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/ /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py && sleep 0' 32134 1727204462.98964: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204462.98968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
32134 1727204462.98971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204462.98973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204462.99033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204462.99041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204462.99043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204462.99080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.01054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204463.01102: stderr chunk (state=3): >>><<< 32134 1727204463.01106: stdout chunk (state=3): >>><<< 32134 1727204463.01124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204463.01127: _low_level_execute_command(): starting 32134 1727204463.01132: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/AnsiballZ_network_connections.py && sleep 0' 32134 1727204463.01602: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.01606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.01609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 
1727204463.01611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.01657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204463.01661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.01717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.34320: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hem4yjux/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hem4yjux/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/2e4ab50e-0a87-42ab-af52-2be8774b7af4: error=unknown <<< 32134 1727204463.34527: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32134 1727204463.36673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204463.36730: stderr chunk (state=3): >>><<< 32134 1727204463.36733: stdout chunk (state=3): >>><<< 32134 1727204463.36752: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hem4yjux/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hem4yjux/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/2e4ab50e-0a87-42ab-af52-2be8774b7af4: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
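The dump above contains both the module invocation and its reply: fedora.linux_system_roles.network_connections was called with provider nm and a single connection entry requesting that ethtest0 be made absent, and it returned changed: true even though an LsrNetworkNmError traceback ('Connection volatilize aborted on ethtest0/...') was printed on stdout first. Reconstructed as role input, those module arguments imply play variables along these lines; the variable names match the role's inputs, but the surrounding play structure is an assumption, since the test playbook itself is not part of this excerpt:

    # Sketch of the role input implied by the module_args above.
    - hosts: managed-node2
      vars:
        network_provider: nm
        network_connections:
          - name: ethtest0
            persistent_state: absent
      roles:
        - fedora.linux_system_roles.network
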
32134 1727204463.36794: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204463.36806: _low_level_execute_command(): starting 32134 1727204463.36811: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204462.9097037-33812-136270711718061/ > /dev/null 2>&1 && sleep 0' 32134 1727204463.37296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.37299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.37304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204463.37306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204463.37308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.37351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204463.37365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.37411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.39358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204463.39484: stderr chunk (state=3): >>><<< 32134 1727204463.39488: stdout chunk (state=3): >>><<< 32134 1727204463.39493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204463.39495: handler run complete 32134 1727204463.39497: attempt loop complete, returning result 32134 1727204463.39500: _execute() done 32134 1727204463.39502: dumping result to json 32134 1727204463.39504: done dumping result, returning 32134 1727204463.39506: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-753f-5162-000000000094] 32134 1727204463.39515: sending task result for task 12b410aa-8751-753f-5162-000000000094 32134 1727204463.39594: done sending task result for task 12b410aa-8751-753f-5162-000000000094 changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 32134 1727204463.39722: no more pending results, returning what we have 32134 1727204463.39726: results queue empty 32134 1727204463.39726: checking for any_errors_fatal 32134 1727204463.39735: done checking for any_errors_fatal 32134 1727204463.39736: checking for max_fail_percentage 32134 1727204463.39738: done checking for max_fail_percentage 32134 1727204463.39740: checking to see if all hosts have failed and the running result is not ok 32134 1727204463.39741: done checking to see if all hosts have failed 32134 1727204463.39742: getting the remaining hosts for this loop 32134 1727204463.39743: done getting the remaining hosts for this loop 32134 1727204463.39747: getting the next task for host managed-node2 32134 1727204463.39753: done getting next task for host managed-node2 32134 1727204463.39756: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32134 1727204463.39758: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204463.39768: getting variables 32134 1727204463.39769: in VariableManager get_vars() 32134 1727204463.39817: Calling all_inventory to load vars for managed-node2 32134 1727204463.39821: Calling groups_inventory to load vars for managed-node2 32134 1727204463.39824: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204463.39830: WORKER PROCESS EXITING 32134 1727204463.39841: Calling all_plugins_play to load vars for managed-node2 32134 1727204463.39844: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204463.39848: Calling groups_plugins_play to load vars for managed-node2 32134 1727204463.41685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204463.43935: done with get_vars() 32134 1727204463.43972: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:03 -0400 (0:00:00.739) 0:00:37.844 ***** 32134 1727204463.44049: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 32134 1727204463.44332: worker is 1 (out of 1 available) 32134 1727204463.44352: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 32134 1727204463.44365: done queuing things up, now waiting for results queue to drain 32134 1727204463.44367: waiting for pending results... 32134 1727204463.44566: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 32134 1727204463.44651: in run() - task 12b410aa-8751-753f-5162-000000000095 32134 1727204463.44665: variable 'ansible_search_path' from source: unknown 32134 1727204463.44669: variable 'ansible_search_path' from source: unknown 32134 1727204463.44707: calling self._execute() 32134 1727204463.44895: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.44900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.44902: variable 'omit' from source: magic vars 32134 1727204463.45342: variable 'ansible_distribution_major_version' from source: facts 32134 1727204463.45361: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204463.45547: variable 'network_state' from source: role '' defaults 32134 1727204463.45567: Evaluated conditional (network_state != {}): False 32134 1727204463.45575: when evaluation is False, skipping this task 32134 1727204463.45584: _execute() done 32134 1727204463.45596: dumping result to json 32134 1727204463.45605: done dumping result, returning 32134 1727204463.45623: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-753f-5162-000000000095] 32134 1727204463.45636: sending task result for task 12b410aa-8751-753f-5162-000000000095 32134 1727204463.45837: done sending task result for task 12b410aa-8751-753f-5162-000000000095 32134 1727204463.45841: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32134 1727204463.45928: no more pending results, returning what we have 32134 1727204463.45933: results queue empty 32134 1727204463.45934: checking for any_errors_fatal 32134 1727204463.45951: done checking for any_errors_fatal 32134 
1727204463.45952: checking for max_fail_percentage 32134 1727204463.45954: done checking for max_fail_percentage 32134 1727204463.45955: checking to see if all hosts have failed and the running result is not ok 32134 1727204463.45956: done checking to see if all hosts have failed 32134 1727204463.45957: getting the remaining hosts for this loop 32134 1727204463.45958: done getting the remaining hosts for this loop 32134 1727204463.45963: getting the next task for host managed-node2 32134 1727204463.45970: done getting next task for host managed-node2 32134 1727204463.45974: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32134 1727204463.45978: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204463.46003: getting variables 32134 1727204463.46006: in VariableManager get_vars() 32134 1727204463.46050: Calling all_inventory to load vars for managed-node2 32134 1727204463.46054: Calling groups_inventory to load vars for managed-node2 32134 1727204463.46056: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204463.46069: Calling all_plugins_play to load vars for managed-node2 32134 1727204463.46073: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204463.46076: Calling groups_plugins_play to load vars for managed-node2 32134 1727204463.47554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204463.49177: done with get_vars() 32134 1727204463.49209: done getting variables 32134 1727204463.49266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:03 -0400 (0:00:00.052) 0:00:37.896 ***** 32134 1727204463.49295: entering _queue_task() for managed-node2/debug 32134 1727204463.49579: worker is 1 (out of 1 available) 32134 1727204463.49596: exiting _queue_task() for managed-node2/debug 32134 1727204463.49613: done queuing things up, now waiting for results queue to drain 32134 1727204463.49615: waiting for pending results... 
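'Configure networking state' was skipped because network_state is still the role default (an empty dict), so the run moves on to the reporting tasks. The task queued here, 'Show stderr messages for the network_connections' (a 'debug' action at main.yml:177), presumably looks close to the sketch below, given the result it produces further on; the task name, action plugin, and displayed variable come from the log, while the exact YAML is an assumption:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
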
32134 1727204463.49806: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32134 1727204463.49894: in run() - task 12b410aa-8751-753f-5162-000000000096 32134 1727204463.49909: variable 'ansible_search_path' from source: unknown 32134 1727204463.49915: variable 'ansible_search_path' from source: unknown 32134 1727204463.49955: calling self._execute() 32134 1727204463.50042: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.50051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.50065: variable 'omit' from source: magic vars 32134 1727204463.50402: variable 'ansible_distribution_major_version' from source: facts 32134 1727204463.50416: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204463.50419: variable 'omit' from source: magic vars 32134 1727204463.50458: variable 'omit' from source: magic vars 32134 1727204463.50488: variable 'omit' from source: magic vars 32134 1727204463.50530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204463.50560: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204463.50579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204463.50597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204463.50614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204463.50642: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204463.50645: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.50650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.50737: Set connection var ansible_timeout to 10 32134 1727204463.50752: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204463.50755: Set connection var ansible_connection to ssh 32134 1727204463.50758: Set connection var ansible_shell_type to sh 32134 1727204463.50764: Set connection var ansible_shell_executable to /bin/sh 32134 1727204463.50771: Set connection var ansible_pipelining to False 32134 1727204463.50791: variable 'ansible_shell_executable' from source: unknown 32134 1727204463.50794: variable 'ansible_connection' from source: unknown 32134 1727204463.50797: variable 'ansible_module_compression' from source: unknown 32134 1727204463.50802: variable 'ansible_shell_type' from source: unknown 32134 1727204463.50804: variable 'ansible_shell_executable' from source: unknown 32134 1727204463.50809: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.50814: variable 'ansible_pipelining' from source: unknown 32134 1727204463.50817: variable 'ansible_timeout' from source: unknown 32134 1727204463.50828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.50944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 
1727204463.50957: variable 'omit' from source: magic vars 32134 1727204463.50962: starting attempt loop 32134 1727204463.50966: running the handler 32134 1727204463.51079: variable '__network_connections_result' from source: set_fact 32134 1727204463.51130: handler run complete 32134 1727204463.51147: attempt loop complete, returning result 32134 1727204463.51152: _execute() done 32134 1727204463.51155: dumping result to json 32134 1727204463.51158: done dumping result, returning 32134 1727204463.51171: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-753f-5162-000000000096] 32134 1727204463.51174: sending task result for task 12b410aa-8751-753f-5162-000000000096 32134 1727204463.51268: done sending task result for task 12b410aa-8751-753f-5162-000000000096 32134 1727204463.51271: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 32134 1727204463.51341: no more pending results, returning what we have 32134 1727204463.51345: results queue empty 32134 1727204463.51346: checking for any_errors_fatal 32134 1727204463.51357: done checking for any_errors_fatal 32134 1727204463.51358: checking for max_fail_percentage 32134 1727204463.51360: done checking for max_fail_percentage 32134 1727204463.51361: checking to see if all hosts have failed and the running result is not ok 32134 1727204463.51363: done checking to see if all hosts have failed 32134 1727204463.51363: getting the remaining hosts for this loop 32134 1727204463.51365: done getting the remaining hosts for this loop 32134 1727204463.51369: getting the next task for host managed-node2 32134 1727204463.51375: done getting next task for host managed-node2 32134 1727204463.51379: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32134 1727204463.51382: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204463.51401: getting variables 32134 1727204463.51403: in VariableManager get_vars() 32134 1727204463.51444: Calling all_inventory to load vars for managed-node2 32134 1727204463.51447: Calling groups_inventory to load vars for managed-node2 32134 1727204463.51450: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204463.51460: Calling all_plugins_play to load vars for managed-node2 32134 1727204463.51463: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204463.51466: Calling groups_plugins_play to load vars for managed-node2 32134 1727204463.52738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204463.54439: done with get_vars() 32134 1727204463.54465: done getting variables 32134 1727204463.54518: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:03 -0400 (0:00:00.052) 0:00:37.949 ***** 32134 1727204463.54544: entering _queue_task() for managed-node2/debug 32134 1727204463.54820: worker is 1 (out of 1 available) 32134 1727204463.54836: exiting _queue_task() for managed-node2/debug 32134 1727204463.54850: done queuing things up, now waiting for results queue to drain 32134 1727204463.54852: waiting for pending results... 32134 1727204463.55054: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32134 1727204463.55137: in run() - task 12b410aa-8751-753f-5162-000000000097 32134 1727204463.55151: variable 'ansible_search_path' from source: unknown 32134 1727204463.55155: variable 'ansible_search_path' from source: unknown 32134 1727204463.55191: calling self._execute() 32134 1727204463.55284: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.55288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.55306: variable 'omit' from source: magic vars 32134 1727204463.55634: variable 'ansible_distribution_major_version' from source: facts 32134 1727204463.55646: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204463.55653: variable 'omit' from source: magic vars 32134 1727204463.55691: variable 'omit' from source: magic vars 32134 1727204463.55722: variable 'omit' from source: magic vars 32134 1727204463.55761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204463.55792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204463.55814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204463.55829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204463.55846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204463.55874: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204463.55879: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.55884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.55972: Set connection var ansible_timeout to 10 32134 1727204463.55985: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204463.55988: Set connection var ansible_connection to ssh 32134 1727204463.55992: Set connection var ansible_shell_type to sh 32134 1727204463.56000: Set connection var ansible_shell_executable to /bin/sh 32134 1727204463.56006: Set connection var ansible_pipelining to False 32134 1727204463.56027: variable 'ansible_shell_executable' from source: unknown 32134 1727204463.56030: variable 'ansible_connection' from source: unknown 32134 1727204463.56035: variable 'ansible_module_compression' from source: unknown 32134 1727204463.56037: variable 'ansible_shell_type' from source: unknown 32134 1727204463.56042: variable 'ansible_shell_executable' from source: unknown 32134 1727204463.56045: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.56050: variable 'ansible_pipelining' from source: unknown 32134 1727204463.56059: variable 'ansible_timeout' from source: unknown 32134 1727204463.56063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.56186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204463.56195: variable 'omit' from source: magic vars 32134 1727204463.56202: starting attempt loop 32134 1727204463.56205: running the handler 32134 1727204463.56248: variable '__network_connections_result' from source: set_fact 32134 1727204463.56323: variable '__network_connections_result' from source: set_fact 32134 1727204463.56423: handler run complete 32134 1727204463.56446: attempt loop complete, returning result 32134 1727204463.56449: _execute() done 32134 1727204463.56452: dumping result to json 32134 1727204463.56458: done dumping result, returning 32134 1727204463.56466: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-753f-5162-000000000097] 32134 1727204463.56472: sending task result for task 12b410aa-8751-753f-5162-000000000097 32134 1727204463.56574: done sending task result for task 12b410aa-8751-753f-5162-000000000097 32134 1727204463.56577: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 32134 1727204463.56676: no more pending results, returning what we have 32134 1727204463.56679: results queue empty 32134 1727204463.56680: checking for any_errors_fatal 32134 1727204463.56691: done checking for any_errors_fatal 32134 1727204463.56692: checking for max_fail_percentage 32134 1727204463.56694: done checking for max_fail_percentage 32134 1727204463.56695: checking to see 
if all hosts have failed and the running result is not ok 32134 1727204463.56696: done checking to see if all hosts have failed 32134 1727204463.56698: getting the remaining hosts for this loop 32134 1727204463.56700: done getting the remaining hosts for this loop 32134 1727204463.56704: getting the next task for host managed-node2 32134 1727204463.56710: done getting next task for host managed-node2 32134 1727204463.56716: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32134 1727204463.56719: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204463.56729: getting variables 32134 1727204463.56730: in VariableManager get_vars() 32134 1727204463.56768: Calling all_inventory to load vars for managed-node2 32134 1727204463.56771: Calling groups_inventory to load vars for managed-node2 32134 1727204463.56773: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204463.56783: Calling all_plugins_play to load vars for managed-node2 32134 1727204463.56786: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204463.56797: Calling groups_plugins_play to load vars for managed-node2 32134 1727204463.58056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204463.59671: done with get_vars() 32134 1727204463.59701: done getting variables 32134 1727204463.59757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:03 -0400 (0:00:00.052) 0:00:38.001 ***** 32134 1727204463.59786: entering _queue_task() for managed-node2/debug 32134 1727204463.60067: worker is 1 (out of 1 available) 32134 1727204463.60083: exiting _queue_task() for managed-node2/debug 32134 1727204463.60098: done queuing things up, now waiting for results queue to drain 32134 1727204463.60100: waiting for pending results... 
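(For context: the "Show debug messages for the network_state" task queued above is gated on the role's network_state variable; the following log entries show the conditional evaluating to False and the task being skipped because network_state is still the role default of an empty mapping. A sketch of such a conditionally skipped debug task, assuming the condition quoted in the skip result below and a plain var dump; the actual role task may differ:)

    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}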
32134 1727204463.60303: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32134 1727204463.60399: in run() - task 12b410aa-8751-753f-5162-000000000098 32134 1727204463.60415: variable 'ansible_search_path' from source: unknown 32134 1727204463.60418: variable 'ansible_search_path' from source: unknown 32134 1727204463.60454: calling self._execute() 32134 1727204463.60544: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.60548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.60560: variable 'omit' from source: magic vars 32134 1727204463.60897: variable 'ansible_distribution_major_version' from source: facts 32134 1727204463.60909: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204463.61018: variable 'network_state' from source: role '' defaults 32134 1727204463.61028: Evaluated conditional (network_state != {}): False 32134 1727204463.61031: when evaluation is False, skipping this task 32134 1727204463.61034: _execute() done 32134 1727204463.61039: dumping result to json 32134 1727204463.61044: done dumping result, returning 32134 1727204463.61052: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-753f-5162-000000000098] 32134 1727204463.61058: sending task result for task 12b410aa-8751-753f-5162-000000000098 32134 1727204463.61156: done sending task result for task 12b410aa-8751-753f-5162-000000000098 32134 1727204463.61159: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 32134 1727204463.61218: no more pending results, returning what we have 32134 1727204463.61222: results queue empty 32134 1727204463.61223: checking for any_errors_fatal 32134 1727204463.61235: done checking for any_errors_fatal 32134 1727204463.61236: checking for max_fail_percentage 32134 1727204463.61237: done checking for max_fail_percentage 32134 1727204463.61238: checking to see if all hosts have failed and the running result is not ok 32134 1727204463.61239: done checking to see if all hosts have failed 32134 1727204463.61240: getting the remaining hosts for this loop 32134 1727204463.61242: done getting the remaining hosts for this loop 32134 1727204463.61246: getting the next task for host managed-node2 32134 1727204463.61252: done getting next task for host managed-node2 32134 1727204463.61256: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32134 1727204463.61259: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204463.61275: getting variables 32134 1727204463.61277: in VariableManager get_vars() 32134 1727204463.61319: Calling all_inventory to load vars for managed-node2 32134 1727204463.61322: Calling groups_inventory to load vars for managed-node2 32134 1727204463.61325: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204463.61335: Calling all_plugins_play to load vars for managed-node2 32134 1727204463.61338: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204463.61342: Calling groups_plugins_play to load vars for managed-node2 32134 1727204463.62744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204463.64335: done with get_vars() 32134 1727204463.64358: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:03 -0400 (0:00:00.046) 0:00:38.048 ***** 32134 1727204463.64439: entering _queue_task() for managed-node2/ping 32134 1727204463.64697: worker is 1 (out of 1 available) 32134 1727204463.64716: exiting _queue_task() for managed-node2/ping 32134 1727204463.64729: done queuing things up, now waiting for results queue to drain 32134 1727204463.64731: waiting for pending results... 32134 1727204463.64924: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 32134 1727204463.65006: in run() - task 12b410aa-8751-753f-5162-000000000099 32134 1727204463.65019: variable 'ansible_search_path' from source: unknown 32134 1727204463.65023: variable 'ansible_search_path' from source: unknown 32134 1727204463.65055: calling self._execute() 32134 1727204463.65156: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.65162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.65179: variable 'omit' from source: magic vars 32134 1727204463.65495: variable 'ansible_distribution_major_version' from source: facts 32134 1727204463.65507: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204463.65522: variable 'omit' from source: magic vars 32134 1727204463.65551: variable 'omit' from source: magic vars 32134 1727204463.65580: variable 'omit' from source: magic vars 32134 1727204463.65617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204463.65650: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204463.65669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204463.65685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204463.65700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204463.65729: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204463.65735: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.65737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.65824: Set connection var ansible_timeout to 10 32134 1727204463.65838: Set connection var 
ansible_module_compression to ZIP_DEFLATED 32134 1727204463.65841: Set connection var ansible_connection to ssh 32134 1727204463.65843: Set connection var ansible_shell_type to sh 32134 1727204463.65855: Set connection var ansible_shell_executable to /bin/sh 32134 1727204463.65860: Set connection var ansible_pipelining to False 32134 1727204463.65881: variable 'ansible_shell_executable' from source: unknown 32134 1727204463.65884: variable 'ansible_connection' from source: unknown 32134 1727204463.65887: variable 'ansible_module_compression' from source: unknown 32134 1727204463.65893: variable 'ansible_shell_type' from source: unknown 32134 1727204463.65896: variable 'ansible_shell_executable' from source: unknown 32134 1727204463.65901: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204463.65905: variable 'ansible_pipelining' from source: unknown 32134 1727204463.65909: variable 'ansible_timeout' from source: unknown 32134 1727204463.65915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204463.66087: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204463.66098: variable 'omit' from source: magic vars 32134 1727204463.66104: starting attempt loop 32134 1727204463.66107: running the handler 32134 1727204463.66122: _low_level_execute_command(): starting 32134 1727204463.66129: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204463.66683: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.66687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.66693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.66696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.66759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204463.66763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.66855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.68583: stdout chunk (state=3): >>>/root <<< 32134 1727204463.68694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204463.68761: stderr chunk (state=3): >>><<< 32134 1727204463.68763: stdout chunk (state=3): >>><<< 32134 1727204463.68779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204463.68815: _low_level_execute_command(): starting 32134 1727204463.68820: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104 `" && echo ansible-tmp-1727204463.6878567-33847-155117656689104="` echo /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104 `" ) && sleep 0' 32134 1727204463.69255: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.69294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204463.69297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.69300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.69309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.69354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204463.69360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.69409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.71532: stdout chunk (state=3): >>>ansible-tmp-1727204463.6878567-33847-155117656689104=/root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104 <<< 32134 1727204463.71741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204463.71745: stdout chunk (state=3): >>><<< 32134 1727204463.71748: stderr chunk (state=3): >>><<< 32134 1727204463.71796: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204463.6878567-33847-155117656689104=/root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204463.71832: variable 'ansible_module_compression' from source: unknown 32134 1727204463.71886: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 32134 1727204463.71931: variable 'ansible_facts' from source: unknown 32134 1727204463.72032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py 32134 1727204463.72299: Sending initial data 32134 1727204463.72303: Sent initial data (153 bytes) 32134 1727204463.72907: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.72962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204463.72979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204463.73002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.73079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.74864: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204463.74941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204463.74986: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp96ygzw_c /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py <<< 32134 1727204463.74990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py" <<< 32134 1727204463.75037: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp96ygzw_c" to remote "/root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py" <<< 32134 1727204463.76194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204463.76539: stderr chunk (state=3): >>><<< 32134 1727204463.76542: stdout chunk (state=3): >>><<< 32134 1727204463.76545: done transferring module to remote 32134 1727204463.76547: _low_level_execute_command(): starting 32134 1727204463.76550: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/ /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py && sleep 0' 32134 1727204463.77141: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204463.77159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204463.77173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.77195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204463.77216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204463.77230: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204463.77244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.77354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204463.77374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204463.77398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.77473: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.79487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204463.79593: stderr chunk (state=3): >>><<< 32134 1727204463.79604: stdout chunk (state=3): >>><<< 32134 1727204463.79632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204463.79643: _low_level_execute_command(): starting 32134 1727204463.79656: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/AnsiballZ_ping.py && sleep 0' 32134 1727204463.80329: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204463.80356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204463.80373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204463.80458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204463.80487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204463.80507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204463.80530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204463.80620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204463.98120: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 32134 1727204463.99629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 
closed. <<< 32134 1727204463.99670: stderr chunk (state=3): >>><<< 32134 1727204463.99682: stdout chunk (state=3): >>><<< 32134 1727204463.99708: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204463.99748: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204463.99765: _low_level_execute_command(): starting 32134 1727204463.99774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204463.6878567-33847-155117656689104/ > /dev/null 2>&1 && sleep 0' 32134 1727204464.00529: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.00636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204464.00651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 32134 1727204464.00678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204464.00754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204464.02845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204464.02848: stdout chunk (state=3): >>><<< 32134 1727204464.02851: stderr chunk (state=3): >>><<< 32134 1727204464.02872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204464.02994: handler run complete 32134 1727204464.02998: attempt loop complete, returning result 32134 1727204464.03000: _execute() done 32134 1727204464.03003: dumping result to json 32134 1727204464.03005: done dumping result, returning 32134 1727204464.03007: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-753f-5162-000000000099] 32134 1727204464.03009: sending task result for task 12b410aa-8751-753f-5162-000000000099 32134 1727204464.03086: done sending task result for task 12b410aa-8751-753f-5162-000000000099 ok: [managed-node2] => { "changed": false, "ping": "pong" } 32134 1727204464.03166: no more pending results, returning what we have 32134 1727204464.03171: results queue empty 32134 1727204464.03173: checking for any_errors_fatal 32134 1727204464.03183: done checking for any_errors_fatal 32134 1727204464.03184: checking for max_fail_percentage 32134 1727204464.03186: done checking for max_fail_percentage 32134 1727204464.03187: checking to see if all hosts have failed and the running result is not ok 32134 1727204464.03188: done checking to see if all hosts have failed 32134 1727204464.03191: getting the remaining hosts for this loop 32134 1727204464.03193: done getting the remaining hosts for this loop 32134 1727204464.03198: getting the next task for host managed-node2 32134 1727204464.03207: done getting next task for host managed-node2 32134 1727204464.03213: ^ task is: TASK: meta (role_complete) 32134 1727204464.03215: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204464.03228: getting variables 32134 1727204464.03230: in VariableManager get_vars() 32134 1727204464.03276: Calling all_inventory to load vars for managed-node2 32134 1727204464.03281: Calling groups_inventory to load vars for managed-node2 32134 1727204464.03284: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204464.03530: Calling all_plugins_play to load vars for managed-node2 32134 1727204464.03535: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204464.03540: Calling groups_plugins_play to load vars for managed-node2 32134 1727204464.04145: WORKER PROCESS EXITING 32134 1727204464.06486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204464.09646: done with get_vars() 32134 1727204464.09684: done getting variables 32134 1727204464.09781: done queuing things up, now waiting for results queue to drain 32134 1727204464.09783: results queue empty 32134 1727204464.09785: checking for any_errors_fatal 32134 1727204464.09788: done checking for any_errors_fatal 32134 1727204464.09792: checking for max_fail_percentage 32134 1727204464.09793: done checking for max_fail_percentage 32134 1727204464.09794: checking to see if all hosts have failed and the running result is not ok 32134 1727204464.09795: done checking to see if all hosts have failed 32134 1727204464.09796: getting the remaining hosts for this loop 32134 1727204464.09798: done getting the remaining hosts for this loop 32134 1727204464.09801: getting the next task for host managed-node2 32134 1727204464.09806: done getting next task for host managed-node2 32134 1727204464.09808: ^ task is: TASK: meta (flush_handlers) 32134 1727204464.09810: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204464.09813: getting variables 32134 1727204464.09815: in VariableManager get_vars() 32134 1727204464.09830: Calling all_inventory to load vars for managed-node2 32134 1727204464.09833: Calling groups_inventory to load vars for managed-node2 32134 1727204464.09836: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204464.09842: Calling all_plugins_play to load vars for managed-node2 32134 1727204464.09845: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204464.09849: Calling groups_plugins_play to load vars for managed-node2 32134 1727204464.16945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204464.19840: done with get_vars() 32134 1727204464.19881: done getting variables 32134 1727204464.20012: in VariableManager get_vars() 32134 1727204464.20029: Calling all_inventory to load vars for managed-node2 32134 1727204464.20033: Calling groups_inventory to load vars for managed-node2 32134 1727204464.20035: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204464.20042: Calling all_plugins_play to load vars for managed-node2 32134 1727204464.20045: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204464.20048: Calling groups_plugins_play to load vars for managed-node2 32134 1727204464.22316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204464.25591: done with get_vars() 32134 1727204464.25645: done queuing things up, now waiting for results queue to drain 32134 1727204464.25647: results queue empty 32134 1727204464.25648: checking for any_errors_fatal 32134 1727204464.25650: done checking for any_errors_fatal 32134 1727204464.25651: checking for max_fail_percentage 32134 1727204464.25653: done checking for max_fail_percentage 32134 1727204464.25654: checking to see if all hosts have failed and the running result is not ok 32134 1727204464.25655: done checking to see if all hosts have failed 32134 1727204464.25656: getting the remaining hosts for this loop 32134 1727204464.25657: done getting the remaining hosts for this loop 32134 1727204464.25660: getting the next task for host managed-node2 32134 1727204464.25665: done getting next task for host managed-node2 32134 1727204464.25667: ^ task is: TASK: meta (flush_handlers) 32134 1727204464.25669: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204464.25672: getting variables 32134 1727204464.25674: in VariableManager get_vars() 32134 1727204464.25688: Calling all_inventory to load vars for managed-node2 32134 1727204464.25693: Calling groups_inventory to load vars for managed-node2 32134 1727204464.25696: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204464.25703: Calling all_plugins_play to load vars for managed-node2 32134 1727204464.25706: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204464.25710: Calling groups_plugins_play to load vars for managed-node2 32134 1727204464.27753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204464.31020: done with get_vars() 32134 1727204464.31062: done getting variables 32134 1727204464.31146: in VariableManager get_vars() 32134 1727204464.31162: Calling all_inventory to load vars for managed-node2 32134 1727204464.31165: Calling groups_inventory to load vars for managed-node2 32134 1727204464.31168: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204464.31179: Calling all_plugins_play to load vars for managed-node2 32134 1727204464.31183: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204464.31187: Calling groups_plugins_play to load vars for managed-node2 32134 1727204464.33581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204464.36057: done with get_vars() 32134 1727204464.36092: done queuing things up, now waiting for results queue to drain 32134 1727204464.36095: results queue empty 32134 1727204464.36095: checking for any_errors_fatal 32134 1727204464.36097: done checking for any_errors_fatal 32134 1727204464.36097: checking for max_fail_percentage 32134 1727204464.36098: done checking for max_fail_percentage 32134 1727204464.36099: checking to see if all hosts have failed and the running result is not ok 32134 1727204464.36099: done checking to see if all hosts have failed 32134 1727204464.36100: getting the remaining hosts for this loop 32134 1727204464.36101: done getting the remaining hosts for this loop 32134 1727204464.36103: getting the next task for host managed-node2 32134 1727204464.36106: done getting next task for host managed-node2 32134 1727204464.36107: ^ task is: None 32134 1727204464.36108: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204464.36109: done queuing things up, now waiting for results queue to drain 32134 1727204464.36110: results queue empty 32134 1727204464.36112: checking for any_errors_fatal 32134 1727204464.36113: done checking for any_errors_fatal 32134 1727204464.36113: checking for max_fail_percentage 32134 1727204464.36114: done checking for max_fail_percentage 32134 1727204464.36115: checking to see if all hosts have failed and the running result is not ok 32134 1727204464.36115: done checking to see if all hosts have failed 32134 1727204464.36116: getting the next task for host managed-node2 32134 1727204464.36118: done getting next task for host managed-node2 32134 1727204464.36118: ^ task is: None 32134 1727204464.36119: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204464.36149: in VariableManager get_vars() 32134 1727204464.36163: done with get_vars() 32134 1727204464.36169: in VariableManager get_vars() 32134 1727204464.36176: done with get_vars() 32134 1727204464.36179: variable 'omit' from source: magic vars 32134 1727204464.36204: in VariableManager get_vars() 32134 1727204464.36213: done with get_vars() 32134 1727204464.36229: variable 'omit' from source: magic vars PLAY [Delete the interface, then assert that device and profile are absent] **** 32134 1727204464.36423: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 32134 1727204464.36443: getting the remaining hosts for this loop 32134 1727204464.36444: done getting the remaining hosts for this loop 32134 1727204464.36447: getting the next task for host managed-node2 32134 1727204464.36449: done getting next task for host managed-node2 32134 1727204464.36450: ^ task is: TASK: Gathering Facts 32134 1727204464.36452: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204464.36453: getting variables 32134 1727204464.36454: in VariableManager get_vars() 32134 1727204464.36461: Calling all_inventory to load vars for managed-node2 32134 1727204464.36463: Calling groups_inventory to load vars for managed-node2 32134 1727204464.36465: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204464.36470: Calling all_plugins_play to load vars for managed-node2 32134 1727204464.36472: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204464.36474: Calling groups_plugins_play to load vars for managed-node2 32134 1727204464.38246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204464.39843: done with get_vars() 32134 1727204464.39869: done getting variables 32134 1727204464.39919: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 Tuesday 24 September 2024 15:01:04 -0400 (0:00:00.755) 0:00:38.803 ***** 32134 1727204464.39940: entering _queue_task() for managed-node2/gather_facts 32134 1727204464.40274: worker is 1 (out of 1 available) 32134 1727204464.40287: exiting _queue_task() for managed-node2/gather_facts 32134 1727204464.40302: done queuing things up, now waiting for results queue to drain 32134 1727204464.40304: waiting for pending results... 
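(For context: the "Gathering Facts" task queued above is the implicit fact-gathering step of the new play announced earlier in the log. A sketch of a play header that would trigger it is shown here; only the play name and the fact that gathering is enabled are taken from the log, while the hosts pattern and other keywords in tests_ipv6_disabled.yml are assumptions for illustration:)

    - name: Delete the interface, then assert that device and profile are absent
      hosts: all            # resolves to managed-node2 in this run
      gather_facts: true    # produces the "Gathering Facts" task traced below
      tasks:
        # ... interface removal and assertion tasks from the test playbook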
32134 1727204464.40713: running TaskExecutor() for managed-node2/TASK: Gathering Facts 32134 1727204464.40771: in run() - task 12b410aa-8751-753f-5162-0000000005ee 32134 1727204464.40799: variable 'ansible_search_path' from source: unknown 32134 1727204464.40852: calling self._execute() 32134 1727204464.40980: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204464.41005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204464.41094: variable 'omit' from source: magic vars 32134 1727204464.41545: variable 'ansible_distribution_major_version' from source: facts 32134 1727204464.41564: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204464.41579: variable 'omit' from source: magic vars 32134 1727204464.41627: variable 'omit' from source: magic vars 32134 1727204464.41682: variable 'omit' from source: magic vars 32134 1727204464.41741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204464.41771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204464.41793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204464.41812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204464.41828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204464.41868: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204464.41872: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204464.41876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204464.41969: Set connection var ansible_timeout to 10 32134 1727204464.41982: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204464.41985: Set connection var ansible_connection to ssh 32134 1727204464.41988: Set connection var ansible_shell_type to sh 32134 1727204464.41997: Set connection var ansible_shell_executable to /bin/sh 32134 1727204464.42003: Set connection var ansible_pipelining to False 32134 1727204464.42026: variable 'ansible_shell_executable' from source: unknown 32134 1727204464.42029: variable 'ansible_connection' from source: unknown 32134 1727204464.42032: variable 'ansible_module_compression' from source: unknown 32134 1727204464.42043: variable 'ansible_shell_type' from source: unknown 32134 1727204464.42045: variable 'ansible_shell_executable' from source: unknown 32134 1727204464.42048: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204464.42051: variable 'ansible_pipelining' from source: unknown 32134 1727204464.42053: variable 'ansible_timeout' from source: unknown 32134 1727204464.42059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204464.42219: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204464.42230: variable 'omit' from source: magic vars 32134 1727204464.42239: starting attempt loop 32134 1727204464.42243: running the 
handler 32134 1727204464.42256: variable 'ansible_facts' from source: unknown 32134 1727204464.42280: _low_level_execute_command(): starting 32134 1727204464.42288: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204464.42841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.42847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.42851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.42895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204464.42913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204464.42959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204464.44776: stdout chunk (state=3): >>>/root <<< 32134 1727204464.45007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204464.45015: stdout chunk (state=3): >>><<< 32134 1727204464.45018: stderr chunk (state=3): >>><<< 32134 1727204464.45049: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204464.45066: _low_level_execute_command(): starting 32134 1727204464.45073: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985 `" && echo ansible-tmp-1727204464.450502-33872-212922780136985="` echo 
/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985 `" ) && sleep 0' 32134 1727204464.45555: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204464.45560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.45564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204464.45573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.45622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204464.45626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204464.45675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204464.47767: stdout chunk (state=3): >>>ansible-tmp-1727204464.450502-33872-212922780136985=/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985 <<< 32134 1727204464.47884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204464.47940: stderr chunk (state=3): >>><<< 32134 1727204464.47945: stdout chunk (state=3): >>><<< 32134 1727204464.47965: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204464.450502-33872-212922780136985=/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204464.48000: variable 'ansible_module_compression' from source: unknown 32134 1727204464.48047: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32134 1727204464.48105: 
variable 'ansible_facts' from source: unknown 32134 1727204464.48230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py 32134 1727204464.48363: Sending initial data 32134 1727204464.48366: Sent initial data (153 bytes) 32134 1727204464.48843: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204464.48847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204464.48849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.48852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.48854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.48909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204464.48913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204464.48960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204464.50640: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204464.50676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32134 1727204464.50714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpmk9bh5t1 /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py <<< 32134 1727204464.50722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py" <<< 32134 1727204464.50764: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpmk9bh5t1" to remote "/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py" <<< 32134 1727204464.50770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py" <<< 32134 1727204464.52408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204464.52486: stderr chunk (state=3): >>><<< 32134 1727204464.52491: stdout chunk (state=3): >>><<< 32134 1727204464.52518: done transferring module to remote 32134 1727204464.52529: _low_level_execute_command(): starting 32134 1727204464.52536: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/ /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py && sleep 0' 32134 1727204464.53024: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.53028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.53031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.53033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.53080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204464.53103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204464.53131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204464.55101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204464.55156: stderr chunk (state=3): >>><<< 32134 1727204464.55160: stdout chunk (state=3): >>><<< 32134 1727204464.55180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204464.55184: _low_level_execute_command(): starting 32134 1727204464.55190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/AnsiballZ_setup.py && sleep 0' 32134 1727204464.55671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204464.55677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204464.55679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 32134 1727204464.55682: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204464.55684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204464.55736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204464.55740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204464.55794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.28516: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.6435546875, "5m": 0.68115234375, "15m": 0.46923828125}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2840, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 877, "free": 2840}, "nocache": {"free": 3478, "used": 239}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "<<< 32134 1727204465.28545: stdout chunk (state=3): >>>ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 968, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144593408, "block_size": 4096, "block_total": 64479564, "block_available": 61314598, "block_used": 3164966, "inode_total": 16384000, "inode_available": 16302236, "inode_used": 81764, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "56:10:7a:3f:31:3a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::928c:aece:b50f:aeb4", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ea:ec:31:9c:5f:fe", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ec:31ff:fe9c:5ffe", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filt<<< 32134 1727204465.28587: stdout chunk (state=3): >>>ers": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::928c:aece:b50f:aeb4", "fe80::4a44:1e77:128f:34e8", "fe80::e8ec:31ff:fe9c:5ffe"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8", "fe80::928c:aece:b50f:aeb4", "fe80::e8ec:31ff:fe9c:5ffe"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "05", "epoch": "1727204465", "epoch_int": "1727204465", "date": "2024-09-24", "time": "15:01:05", "iso8601_micro": "2024-09-24T19:01:05.280562Z", "iso8601": "2024-09-24T19:01:05Z", "iso8601_basic": "20240924T150105280562", "iso8601_basic_short": "20240924T150105", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32134 1727204465.30719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204465.30780: stderr chunk (state=3): >>><<< 32134 1727204465.30783: stdout chunk (state=3): >>><<< 32134 1727204465.30829: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.6435546875, "5m": 0.68115234375, "15m": 0.46923828125}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2840, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 877, "free": 2840}, "nocache": {"free": 3478, "used": 239}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 968, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251144593408, "block_size": 4096, "block_total": 64479564, "block_available": 61314598, "block_used": 3164966, "inode_total": 16384000, "inode_available": 16302236, "inode_used": 81764, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "56:10:7a:3f:31:3a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::928c:aece:b50f:aeb4", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ea:ec:31:9c:5f:fe", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ec:31ff:fe9c:5ffe", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::928c:aece:b50f:aeb4", "fe80::4a44:1e77:128f:34e8", "fe80::e8ec:31ff:fe9c:5ffe"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8", "fe80::928c:aece:b50f:aeb4", "fe80::e8ec:31ff:fe9c:5ffe"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "05", "epoch": "1727204465", "epoch_int": "1727204465", "date": "2024-09-24", "time": "15:01:05", "iso8601_micro": "2024-09-24T19:01:05.280562Z", "iso8601": "2024-09-24T19:01:05Z", 
"iso8601_basic": "20240924T150105280562", "iso8601_basic_short": "20240924T150105", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204465.31264: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204465.31284: _low_level_execute_command(): starting 32134 1727204465.31289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204464.450502-33872-212922780136985/ > /dev/null 2>&1 && sleep 0' 32134 1727204465.31777: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.31780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.31783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204465.31785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204465.31787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.31838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204465.31861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.31896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.33908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204465.33962: stderr chunk (state=3): >>><<< 32134 1727204465.33966: stdout chunk (state=3): >>><<< 32134 1727204465.33984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204465.33996: handler run complete 32134 1727204465.34136: variable 'ansible_facts' from source: unknown 32134 1727204465.34242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.34587: variable 'ansible_facts' from source: unknown 32134 1727204465.34678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.34823: attempt loop complete, returning result 32134 1727204465.34829: _execute() done 32134 1727204465.34833: dumping result to json 32134 1727204465.34868: done dumping result, returning 32134 1727204465.34877: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-753f-5162-0000000005ee] 32134 1727204465.34882: sending task result for task 12b410aa-8751-753f-5162-0000000005ee ok: [managed-node2] 32134 1727204465.35681: no more pending results, returning what we have 32134 1727204465.35684: results queue empty 32134 1727204465.35685: checking for any_errors_fatal 32134 1727204465.35686: done checking for any_errors_fatal 32134 1727204465.35686: checking for max_fail_percentage 32134 1727204465.35687: done checking for max_fail_percentage 32134 1727204465.35688: checking to see if all hosts have failed and the running result is not ok 32134 1727204465.35691: done checking to see if all hosts have failed 32134 
1727204465.35691: getting the remaining hosts for this loop 32134 1727204465.35692: done getting the remaining hosts for this loop 32134 1727204465.35695: getting the next task for host managed-node2 32134 1727204465.35699: done getting next task for host managed-node2 32134 1727204465.35701: ^ task is: TASK: meta (flush_handlers) 32134 1727204465.35702: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204465.35706: getting variables 32134 1727204465.35707: in VariableManager get_vars() 32134 1727204465.35726: Calling all_inventory to load vars for managed-node2 32134 1727204465.35728: Calling groups_inventory to load vars for managed-node2 32134 1727204465.35732: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.35743: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.35746: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.35749: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.36777: done sending task result for task 12b410aa-8751-753f-5162-0000000005ee 32134 1727204465.36781: WORKER PROCESS EXITING 32134 1727204465.37071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.38680: done with get_vars() 32134 1727204465.38706: done getting variables 32134 1727204465.38764: in VariableManager get_vars() 32134 1727204465.38774: Calling all_inventory to load vars for managed-node2 32134 1727204465.38777: Calling groups_inventory to load vars for managed-node2 32134 1727204465.38779: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.38783: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.38785: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.38787: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.39870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.41462: done with get_vars() 32134 1727204465.41491: done queuing things up, now waiting for results queue to drain 32134 1727204465.41493: results queue empty 32134 1727204465.41494: checking for any_errors_fatal 32134 1727204465.41498: done checking for any_errors_fatal 32134 1727204465.41499: checking for max_fail_percentage 32134 1727204465.41499: done checking for max_fail_percentage 32134 1727204465.41500: checking to see if all hosts have failed and the running result is not ok 32134 1727204465.41505: done checking to see if all hosts have failed 32134 1727204465.41505: getting the remaining hosts for this loop 32134 1727204465.41506: done getting the remaining hosts for this loop 32134 1727204465.41508: getting the next task for host managed-node2 32134 1727204465.41512: done getting next task for host managed-node2 32134 1727204465.41514: ^ task is: TASK: Include the task 'delete_interface.yml' 32134 1727204465.41516: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204465.41517: getting variables 32134 1727204465.41518: in VariableManager get_vars() 32134 1727204465.41526: Calling all_inventory to load vars for managed-node2 32134 1727204465.41527: Calling groups_inventory to load vars for managed-node2 32134 1727204465.41529: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.41534: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.41535: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.41538: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.42709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.44275: done with get_vars() 32134 1727204465.44297: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:83 Tuesday 24 September 2024 15:01:05 -0400 (0:00:01.044) 0:00:39.847 ***** 32134 1727204465.44364: entering _queue_task() for managed-node2/include_tasks 32134 1727204465.44640: worker is 1 (out of 1 available) 32134 1727204465.44655: exiting _queue_task() for managed-node2/include_tasks 32134 1727204465.44669: done queuing things up, now waiting for results queue to drain 32134 1727204465.44671: waiting for pending results... 32134 1727204465.44872: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 32134 1727204465.44965: in run() - task 12b410aa-8751-753f-5162-00000000009c 32134 1727204465.44976: variable 'ansible_search_path' from source: unknown 32134 1727204465.45020: calling self._execute() 32134 1727204465.45099: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204465.45106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204465.45122: variable 'omit' from source: magic vars 32134 1727204465.45453: variable 'ansible_distribution_major_version' from source: facts 32134 1727204465.45465: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204465.45471: _execute() done 32134 1727204465.45475: dumping result to json 32134 1727204465.45480: done dumping result, returning 32134 1727204465.45487: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [12b410aa-8751-753f-5162-00000000009c] 32134 1727204465.45494: sending task result for task 12b410aa-8751-753f-5162-00000000009c 32134 1727204465.45602: done sending task result for task 12b410aa-8751-753f-5162-00000000009c 32134 1727204465.45605: WORKER PROCESS EXITING 32134 1727204465.45636: no more pending results, returning what we have 32134 1727204465.45641: in VariableManager get_vars() 32134 1727204465.45676: Calling all_inventory to load vars for managed-node2 32134 1727204465.45679: Calling groups_inventory to load vars for managed-node2 32134 1727204465.45683: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.45699: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.45703: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.45707: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.47051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.48651: done with get_vars() 32134 
1727204465.48670: variable 'ansible_search_path' from source: unknown 32134 1727204465.48682: we have included files to process 32134 1727204465.48683: generating all_blocks data 32134 1727204465.48684: done generating all_blocks data 32134 1727204465.48685: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 32134 1727204465.48686: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 32134 1727204465.48687: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 32134 1727204465.48879: done processing included file 32134 1727204465.48881: iterating over new_blocks loaded from include file 32134 1727204465.48882: in VariableManager get_vars() 32134 1727204465.48893: done with get_vars() 32134 1727204465.48894: filtering new block on tags 32134 1727204465.48906: done filtering new block on tags 32134 1727204465.48907: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 32134 1727204465.48912: extending task lists for all hosts with included blocks 32134 1727204465.48977: done extending task lists 32134 1727204465.48978: done processing included files 32134 1727204465.48979: results queue empty 32134 1727204465.48979: checking for any_errors_fatal 32134 1727204465.48980: done checking for any_errors_fatal 32134 1727204465.48981: checking for max_fail_percentage 32134 1727204465.48982: done checking for max_fail_percentage 32134 1727204465.48982: checking to see if all hosts have failed and the running result is not ok 32134 1727204465.48983: done checking to see if all hosts have failed 32134 1727204465.48983: getting the remaining hosts for this loop 32134 1727204465.48984: done getting the remaining hosts for this loop 32134 1727204465.48986: getting the next task for host managed-node2 32134 1727204465.48991: done getting next task for host managed-node2 32134 1727204465.48992: ^ task is: TASK: Remove test interface if necessary 32134 1727204465.48994: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204465.48996: getting variables 32134 1727204465.48997: in VariableManager get_vars() 32134 1727204465.49003: Calling all_inventory to load vars for managed-node2 32134 1727204465.49005: Calling groups_inventory to load vars for managed-node2 32134 1727204465.49007: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.49012: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.49014: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.49016: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.50119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.51704: done with get_vars() 32134 1727204465.51730: done getting variables 32134 1727204465.51769: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.074) 0:00:39.921 ***** 32134 1727204465.51797: entering _queue_task() for managed-node2/command 32134 1727204465.52073: worker is 1 (out of 1 available) 32134 1727204465.52092: exiting _queue_task() for managed-node2/command 32134 1727204465.52106: done queuing things up, now waiting for results queue to drain 32134 1727204465.52108: waiting for pending results... 32134 1727204465.52298: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 32134 1727204465.52391: in run() - task 12b410aa-8751-753f-5162-0000000005ff 32134 1727204465.52403: variable 'ansible_search_path' from source: unknown 32134 1727204465.52407: variable 'ansible_search_path' from source: unknown 32134 1727204465.52441: calling self._execute() 32134 1727204465.52521: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204465.52528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204465.52540: variable 'omit' from source: magic vars 32134 1727204465.52868: variable 'ansible_distribution_major_version' from source: facts 32134 1727204465.52880: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204465.52889: variable 'omit' from source: magic vars 32134 1727204465.52928: variable 'omit' from source: magic vars 32134 1727204465.53010: variable 'interface' from source: set_fact 32134 1727204465.53027: variable 'omit' from source: magic vars 32134 1727204465.53065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204465.53097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204465.53122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204465.53138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204465.53152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 
1727204465.53180: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204465.53183: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204465.53190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204465.53277: Set connection var ansible_timeout to 10 32134 1727204465.53291: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204465.53294: Set connection var ansible_connection to ssh 32134 1727204465.53297: Set connection var ansible_shell_type to sh 32134 1727204465.53305: Set connection var ansible_shell_executable to /bin/sh 32134 1727204465.53311: Set connection var ansible_pipelining to False 32134 1727204465.53337: variable 'ansible_shell_executable' from source: unknown 32134 1727204465.53341: variable 'ansible_connection' from source: unknown 32134 1727204465.53343: variable 'ansible_module_compression' from source: unknown 32134 1727204465.53348: variable 'ansible_shell_type' from source: unknown 32134 1727204465.53350: variable 'ansible_shell_executable' from source: unknown 32134 1727204465.53355: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204465.53360: variable 'ansible_pipelining' from source: unknown 32134 1727204465.53364: variable 'ansible_timeout' from source: unknown 32134 1727204465.53369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204465.53490: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204465.53500: variable 'omit' from source: magic vars 32134 1727204465.53506: starting attempt loop 32134 1727204465.53509: running the handler 32134 1727204465.53528: _low_level_execute_command(): starting 32134 1727204465.53537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204465.54094: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.54100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.54103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.54105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.54163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204465.54170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204465.54173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.54219: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.55993: stdout chunk (state=3): >>>/root <<< 32134 1727204465.56103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204465.56162: stderr chunk (state=3): >>><<< 32134 1727204465.56167: stdout chunk (state=3): >>><<< 32134 1727204465.56191: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204465.56205: _low_level_execute_command(): starting 32134 1727204465.56214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792 `" && echo ansible-tmp-1727204465.5619133-33892-251714415584792="` echo /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792 `" ) && sleep 0' 32134 1727204465.56681: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204465.56684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204465.56687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204465.56699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204465.56704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.56752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204465.56758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.56800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 
1727204465.58876: stdout chunk (state=3): >>>ansible-tmp-1727204465.5619133-33892-251714415584792=/root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792 <<< 32134 1727204465.58998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204465.59048: stderr chunk (state=3): >>><<< 32134 1727204465.59051: stdout chunk (state=3): >>><<< 32134 1727204465.59068: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204465.5619133-33892-251714415584792=/root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204465.59097: variable 'ansible_module_compression' from source: unknown 32134 1727204465.59141: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204465.59176: variable 'ansible_facts' from source: unknown 32134 1727204465.59239: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py 32134 1727204465.59351: Sending initial data 32134 1727204465.59355: Sent initial data (156 bytes) 32134 1727204465.59827: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204465.59833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204465.59835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204465.59837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204465.59841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.59888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
32134 1727204465.59897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.59938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.61687: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32134 1727204465.61695: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204465.61725: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204465.61766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp9ztqin6r /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py <<< 32134 1727204465.61771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py" <<< 32134 1727204465.61803: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmp9ztqin6r" to remote "/root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py" <<< 32134 1727204465.62578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204465.62643: stderr chunk (state=3): >>><<< 32134 1727204465.62646: stdout chunk (state=3): >>><<< 32134 1727204465.62668: done transferring module to remote 32134 1727204465.62678: _low_level_execute_command(): starting 32134 1727204465.62683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/ /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py && sleep 0' 32134 1727204465.63140: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204465.63144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204465.63147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.63149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.63151: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.63211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204465.63216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.63255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.65232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204465.65285: stderr chunk (state=3): >>><<< 32134 1727204465.65291: stdout chunk (state=3): >>><<< 32134 1727204465.65306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204465.65310: _low_level_execute_command(): starting 32134 1727204465.65321: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/AnsiballZ_command.py && sleep 0' 32134 1727204465.65785: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.65790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.65804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.65806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.65850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204465.65853: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.65914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.85053: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:01:05.837935", "end": "2024-09-24 15:01:05.846870", "delta": "0:00:00.008935", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204465.87301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204465.87367: stderr chunk (state=3): >>><<< 32134 1727204465.87370: stdout chunk (state=3): >>><<< 32134 1727204465.87391: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:01:05.837935", "end": "2024-09-24 15:01:05.846870", "delta": "0:00:00.008935", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
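For reference, the 'Remove test interface if necessary' task at delete_interface.yml:3 that produced the module invocation above can be sketched from the logged module_args ('ip link del ethtest0', _uses_shell: false) and the 'interface' variable set earlier via set_fact; this is only a reconstruction inferred from the log, and the real task file may differ, for example in its changed_when/failed_when handling:

    - name: Remove test interface if necessary
      command: ip link del {{ interface }}   # logged cmd: ["ip", "link", "del", "ethtest0"]; interface comes from set_fact
      changed_when: false                    # assumption: the callback output below reports changed: false even though rc=0
      failed_when: false                     # assumption: "if necessary" suggests a missing link should not fail the play
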
32134 1727204465.87430: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204465.87442: _low_level_execute_command(): starting 32134 1727204465.87448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204465.5619133-33892-251714415584792/ > /dev/null 2>&1 && sleep 0' 32134 1727204465.87943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.87947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204465.87950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.87952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204465.87954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204465.88015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204465.88019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204465.88061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204465.89981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204465.90031: stderr chunk (state=3): >>><<< 32134 1727204465.90035: stdout chunk (state=3): >>><<< 32134 1727204465.90051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204465.90060: handler run complete 32134 1727204465.90080: Evaluated conditional (False): False 32134 1727204465.90091: attempt loop complete, returning result 32134 1727204465.90095: _execute() done 32134 1727204465.90099: dumping result to json 32134 1727204465.90105: done dumping result, returning 32134 1727204465.90117: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [12b410aa-8751-753f-5162-0000000005ff] 32134 1727204465.90121: sending task result for task 12b410aa-8751-753f-5162-0000000005ff 32134 1727204465.90234: done sending task result for task 12b410aa-8751-753f-5162-0000000005ff 32134 1727204465.90238: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.008935", "end": "2024-09-24 15:01:05.846870", "rc": 0, "start": "2024-09-24 15:01:05.837935" } 32134 1727204465.90317: no more pending results, returning what we have 32134 1727204465.90321: results queue empty 32134 1727204465.90322: checking for any_errors_fatal 32134 1727204465.90324: done checking for any_errors_fatal 32134 1727204465.90325: checking for max_fail_percentage 32134 1727204465.90326: done checking for max_fail_percentage 32134 1727204465.90327: checking to see if all hosts have failed and the running result is not ok 32134 1727204465.90329: done checking to see if all hosts have failed 32134 1727204465.90329: getting the remaining hosts for this loop 32134 1727204465.90331: done getting the remaining hosts for this loop 32134 1727204465.90335: getting the next task for host managed-node2 32134 1727204465.90344: done getting next task for host managed-node2 32134 1727204465.90349: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 32134 1727204465.90352: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204465.90356: getting variables 32134 1727204465.90357: in VariableManager get_vars() 32134 1727204465.90388: Calling all_inventory to load vars for managed-node2 32134 1727204465.90394: Calling groups_inventory to load vars for managed-node2 32134 1727204465.90398: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.90409: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.90412: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.90416: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.91747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.93352: done with get_vars() 32134 1727204465.93374: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:85 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.416) 0:00:40.338 ***** 32134 1727204465.93457: entering _queue_task() for managed-node2/include_tasks 32134 1727204465.93713: worker is 1 (out of 1 available) 32134 1727204465.93728: exiting _queue_task() for managed-node2/include_tasks 32134 1727204465.93740: done queuing things up, now waiting for results queue to drain 32134 1727204465.93742: waiting for pending results... 32134 1727204465.93937: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' 32134 1727204465.94020: in run() - task 12b410aa-8751-753f-5162-00000000009d 32134 1727204465.94034: variable 'ansible_search_path' from source: unknown 32134 1727204465.94066: calling self._execute() 32134 1727204465.94158: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204465.94166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204465.94175: variable 'omit' from source: magic vars 32134 1727204465.94506: variable 'ansible_distribution_major_version' from source: facts 32134 1727204465.94522: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204465.94530: _execute() done 32134 1727204465.94533: dumping result to json 32134 1727204465.94536: done dumping result, returning 32134 1727204465.94544: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' [12b410aa-8751-753f-5162-00000000009d] 32134 1727204465.94549: sending task result for task 12b410aa-8751-753f-5162-00000000009d 32134 1727204465.94649: done sending task result for task 12b410aa-8751-753f-5162-00000000009d 32134 1727204465.94652: WORKER PROCESS EXITING 32134 1727204465.94683: no more pending results, returning what we have 32134 1727204465.94688: in VariableManager get_vars() 32134 1727204465.94725: Calling all_inventory to load vars for managed-node2 32134 1727204465.94729: Calling groups_inventory to load vars for managed-node2 32134 1727204465.94733: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.94748: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.94752: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.94755: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.96092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204465.97684: done with get_vars() 32134 
1727204465.97708: variable 'ansible_search_path' from source: unknown 32134 1727204465.97720: we have included files to process 32134 1727204465.97721: generating all_blocks data 32134 1727204465.97722: done generating all_blocks data 32134 1727204465.97726: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 32134 1727204465.97727: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 32134 1727204465.97728: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 32134 1727204465.97857: in VariableManager get_vars() 32134 1727204465.97871: done with get_vars() 32134 1727204465.97965: done processing included file 32134 1727204465.97967: iterating over new_blocks loaded from include file 32134 1727204465.97969: in VariableManager get_vars() 32134 1727204465.97978: done with get_vars() 32134 1727204465.97979: filtering new block on tags 32134 1727204465.97994: done filtering new block on tags 32134 1727204465.97996: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 32134 1727204465.98000: extending task lists for all hosts with included blocks 32134 1727204465.98111: done extending task lists 32134 1727204465.98112: done processing included files 32134 1727204465.98113: results queue empty 32134 1727204465.98113: checking for any_errors_fatal 32134 1727204465.98117: done checking for any_errors_fatal 32134 1727204465.98118: checking for max_fail_percentage 32134 1727204465.98118: done checking for max_fail_percentage 32134 1727204465.98119: checking to see if all hosts have failed and the running result is not ok 32134 1727204465.98120: done checking to see if all hosts have failed 32134 1727204465.98121: getting the remaining hosts for this loop 32134 1727204465.98122: done getting the remaining hosts for this loop 32134 1727204465.98125: getting the next task for host managed-node2 32134 1727204465.98128: done getting next task for host managed-node2 32134 1727204465.98129: ^ task is: TASK: Include the task 'get_profile_stat.yml' 32134 1727204465.98132: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204465.98134: getting variables 32134 1727204465.98135: in VariableManager get_vars() 32134 1727204465.98143: Calling all_inventory to load vars for managed-node2 32134 1727204465.98144: Calling groups_inventory to load vars for managed-node2 32134 1727204465.98146: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204465.98150: Calling all_plugins_play to load vars for managed-node2 32134 1727204465.98152: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204465.98155: Calling groups_plugins_play to load vars for managed-node2 32134 1727204465.99261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204466.00833: done with get_vars() 32134 1727204466.00856: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.074) 0:00:40.413 ***** 32134 1727204466.00917: entering _queue_task() for managed-node2/include_tasks 32134 1727204466.01185: worker is 1 (out of 1 available) 32134 1727204466.01200: exiting _queue_task() for managed-node2/include_tasks 32134 1727204466.01212: done queuing things up, now waiting for results queue to drain 32134 1727204466.01214: waiting for pending results... 32134 1727204466.01407: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 32134 1727204466.01493: in run() - task 12b410aa-8751-753f-5162-000000000612 32134 1727204466.01506: variable 'ansible_search_path' from source: unknown 32134 1727204466.01509: variable 'ansible_search_path' from source: unknown 32134 1727204466.01547: calling self._execute() 32134 1727204466.01627: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.01634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.01645: variable 'omit' from source: magic vars 32134 1727204466.01972: variable 'ansible_distribution_major_version' from source: facts 32134 1727204466.01985: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204466.01996: _execute() done 32134 1727204466.02000: dumping result to json 32134 1727204466.02003: done dumping result, returning 32134 1727204466.02010: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-753f-5162-000000000612] 32134 1727204466.02018: sending task result for task 12b410aa-8751-753f-5162-000000000612 32134 1727204466.02116: done sending task result for task 12b410aa-8751-753f-5162-000000000612 32134 1727204466.02119: WORKER PROCESS EXITING 32134 1727204466.02149: no more pending results, returning what we have 32134 1727204466.02154: in VariableManager get_vars() 32134 1727204466.02192: Calling all_inventory to load vars for managed-node2 32134 1727204466.02196: Calling groups_inventory to load vars for managed-node2 32134 1727204466.02200: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204466.02214: Calling all_plugins_play to load vars for managed-node2 32134 1727204466.02218: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204466.02221: Calling groups_plugins_play to load vars for managed-node2 32134 1727204466.03533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 32134 1727204466.05126: done with get_vars() 32134 1727204466.05151: variable 'ansible_search_path' from source: unknown 32134 1727204466.05152: variable 'ansible_search_path' from source: unknown 32134 1727204466.05183: we have included files to process 32134 1727204466.05184: generating all_blocks data 32134 1727204466.05185: done generating all_blocks data 32134 1727204466.05186: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32134 1727204466.05187: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32134 1727204466.05191: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32134 1727204466.06044: done processing included file 32134 1727204466.06046: iterating over new_blocks loaded from include file 32134 1727204466.06047: in VariableManager get_vars() 32134 1727204466.06059: done with get_vars() 32134 1727204466.06060: filtering new block on tags 32134 1727204466.06079: done filtering new block on tags 32134 1727204466.06081: in VariableManager get_vars() 32134 1727204466.06092: done with get_vars() 32134 1727204466.06093: filtering new block on tags 32134 1727204466.06111: done filtering new block on tags 32134 1727204466.06114: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 32134 1727204466.06119: extending task lists for all hosts with included blocks 32134 1727204466.06195: done extending task lists 32134 1727204466.06196: done processing included files 32134 1727204466.06196: results queue empty 32134 1727204466.06197: checking for any_errors_fatal 32134 1727204466.06199: done checking for any_errors_fatal 32134 1727204466.06200: checking for max_fail_percentage 32134 1727204466.06201: done checking for max_fail_percentage 32134 1727204466.06202: checking to see if all hosts have failed and the running result is not ok 32134 1727204466.06202: done checking to see if all hosts have failed 32134 1727204466.06203: getting the remaining hosts for this loop 32134 1727204466.06204: done getting the remaining hosts for this loop 32134 1727204466.06206: getting the next task for host managed-node2 32134 1727204466.06209: done getting next task for host managed-node2 32134 1727204466.06211: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 32134 1727204466.06215: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204466.06217: getting variables 32134 1727204466.06218: in VariableManager get_vars() 32134 1727204466.06273: Calling all_inventory to load vars for managed-node2 32134 1727204466.06276: Calling groups_inventory to load vars for managed-node2 32134 1727204466.06278: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204466.06283: Calling all_plugins_play to load vars for managed-node2 32134 1727204466.06285: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204466.06287: Calling groups_plugins_play to load vars for managed-node2 32134 1727204466.07387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204466.08968: done with get_vars() 32134 1727204466.08991: done getting variables 32134 1727204466.09024: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.081) 0:00:40.494 ***** 32134 1727204466.09051: entering _queue_task() for managed-node2/set_fact 32134 1727204466.09329: worker is 1 (out of 1 available) 32134 1727204466.09343: exiting _queue_task() for managed-node2/set_fact 32134 1727204466.09355: done queuing things up, now waiting for results queue to drain 32134 1727204466.09357: waiting for pending results... 
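The set_fact task queued here (get_profile_stat.yml:3) is not reproduced verbatim in this trace. Judging from the ansible_facts echoed in its result a little further down, a plausible sketch of it is the following (reconstructed from the log, not copied from the collection's file):

# Sketch only: reconstructed from the ansible_facts returned below,
# not the verbatim contents of get_profile_stat.yml.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

Since set_fact runs entirely on the controller, the handler completes below without any SSH traffic to managed-node2.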
32134 1727204466.09558: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 32134 1727204466.09652: in run() - task 12b410aa-8751-753f-5162-00000000062a 32134 1727204466.09665: variable 'ansible_search_path' from source: unknown 32134 1727204466.09669: variable 'ansible_search_path' from source: unknown 32134 1727204466.09704: calling self._execute() 32134 1727204466.09783: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.09791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.09803: variable 'omit' from source: magic vars 32134 1727204466.10136: variable 'ansible_distribution_major_version' from source: facts 32134 1727204466.10147: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204466.10158: variable 'omit' from source: magic vars 32134 1727204466.10198: variable 'omit' from source: magic vars 32134 1727204466.10231: variable 'omit' from source: magic vars 32134 1727204466.10268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204466.10301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204466.10324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204466.10340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204466.10353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204466.10384: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204466.10387: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.10394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.10484: Set connection var ansible_timeout to 10 32134 1727204466.10495: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204466.10499: Set connection var ansible_connection to ssh 32134 1727204466.10502: Set connection var ansible_shell_type to sh 32134 1727204466.10509: Set connection var ansible_shell_executable to /bin/sh 32134 1727204466.10518: Set connection var ansible_pipelining to False 32134 1727204466.10537: variable 'ansible_shell_executable' from source: unknown 32134 1727204466.10540: variable 'ansible_connection' from source: unknown 32134 1727204466.10543: variable 'ansible_module_compression' from source: unknown 32134 1727204466.10548: variable 'ansible_shell_type' from source: unknown 32134 1727204466.10550: variable 'ansible_shell_executable' from source: unknown 32134 1727204466.10555: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.10560: variable 'ansible_pipelining' from source: unknown 32134 1727204466.10563: variable 'ansible_timeout' from source: unknown 32134 1727204466.10568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.10689: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204466.10701: variable 
'omit' from source: magic vars 32134 1727204466.10706: starting attempt loop 32134 1727204466.10710: running the handler 32134 1727204466.10725: handler run complete 32134 1727204466.10736: attempt loop complete, returning result 32134 1727204466.10739: _execute() done 32134 1727204466.10744: dumping result to json 32134 1727204466.10749: done dumping result, returning 32134 1727204466.10756: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-753f-5162-00000000062a] 32134 1727204466.10761: sending task result for task 12b410aa-8751-753f-5162-00000000062a 32134 1727204466.10846: done sending task result for task 12b410aa-8751-753f-5162-00000000062a 32134 1727204466.10849: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 32134 1727204466.10909: no more pending results, returning what we have 32134 1727204466.10913: results queue empty 32134 1727204466.10914: checking for any_errors_fatal 32134 1727204466.10916: done checking for any_errors_fatal 32134 1727204466.10916: checking for max_fail_percentage 32134 1727204466.10918: done checking for max_fail_percentage 32134 1727204466.10920: checking to see if all hosts have failed and the running result is not ok 32134 1727204466.10921: done checking to see if all hosts have failed 32134 1727204466.10922: getting the remaining hosts for this loop 32134 1727204466.10923: done getting the remaining hosts for this loop 32134 1727204466.10927: getting the next task for host managed-node2 32134 1727204466.10935: done getting next task for host managed-node2 32134 1727204466.10938: ^ task is: TASK: Stat profile file 32134 1727204466.10944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204466.10949: getting variables 32134 1727204466.10950: in VariableManager get_vars() 32134 1727204466.10980: Calling all_inventory to load vars for managed-node2 32134 1727204466.10983: Calling groups_inventory to load vars for managed-node2 32134 1727204466.10987: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204466.11006: Calling all_plugins_play to load vars for managed-node2 32134 1727204466.11010: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204466.11013: Calling groups_plugins_play to load vars for managed-node2 32134 1727204466.12254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204466.13945: done with get_vars() 32134 1727204466.13967: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.049) 0:00:40.544 ***** 32134 1727204466.14047: entering _queue_task() for managed-node2/stat 32134 1727204466.14312: worker is 1 (out of 1 available) 32134 1727204466.14328: exiting _queue_task() for managed-node2/stat 32134 1727204466.14340: done queuing things up, now waiting for results queue to drain 32134 1727204466.14343: waiting for pending results... 32134 1727204466.14541: running TaskExecutor() for managed-node2/TASK: Stat profile file 32134 1727204466.14636: in run() - task 12b410aa-8751-753f-5162-00000000062b 32134 1727204466.14649: variable 'ansible_search_path' from source: unknown 32134 1727204466.14652: variable 'ansible_search_path' from source: unknown 32134 1727204466.14685: calling self._execute() 32134 1727204466.14769: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.14775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.14790: variable 'omit' from source: magic vars 32134 1727204466.15119: variable 'ansible_distribution_major_version' from source: facts 32134 1727204466.15132: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204466.15140: variable 'omit' from source: magic vars 32134 1727204466.15181: variable 'omit' from source: magic vars 32134 1727204466.15269: variable 'profile' from source: include params 32134 1727204466.15274: variable 'interface' from source: set_fact 32134 1727204466.15341: variable 'interface' from source: set_fact 32134 1727204466.15358: variable 'omit' from source: magic vars 32134 1727204466.15397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204466.15431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204466.15454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204466.15471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204466.15482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204466.15511: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204466.15518: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.15522: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.15611: Set connection var ansible_timeout to 10 32134 1727204466.15626: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204466.15629: Set connection var ansible_connection to ssh 32134 1727204466.15632: Set connection var ansible_shell_type to sh 32134 1727204466.15639: Set connection var ansible_shell_executable to /bin/sh 32134 1727204466.15645: Set connection var ansible_pipelining to False 32134 1727204466.15667: variable 'ansible_shell_executable' from source: unknown 32134 1727204466.15671: variable 'ansible_connection' from source: unknown 32134 1727204466.15675: variable 'ansible_module_compression' from source: unknown 32134 1727204466.15678: variable 'ansible_shell_type' from source: unknown 32134 1727204466.15680: variable 'ansible_shell_executable' from source: unknown 32134 1727204466.15686: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.15690: variable 'ansible_pipelining' from source: unknown 32134 1727204466.15696: variable 'ansible_timeout' from source: unknown 32134 1727204466.15701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.15873: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204466.15884: variable 'omit' from source: magic vars 32134 1727204466.15888: starting attempt loop 32134 1727204466.15893: running the handler 32134 1727204466.15909: _low_level_execute_command(): starting 32134 1727204466.15920: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204466.16477: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.16483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204466.16487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.16542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.16546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.16600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.18345: stdout chunk (state=3): >>>/root <<< 32134 1727204466.18450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.18509: stderr chunk (state=3): >>><<< 32134 1727204466.18516: stdout chunk (state=3): >>><<< 32134 1727204466.18534: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.18547: _low_level_execute_command(): starting 32134 1727204466.18553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334 `" && echo ansible-tmp-1727204466.1853428-33903-77376668357334="` echo /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334 `" ) && sleep 0' 32134 1727204466.19014: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.19018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204466.19021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.19033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.19086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.19094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.19130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.21172: stdout chunk (state=3): >>>ansible-tmp-1727204466.1853428-33903-77376668357334=/root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334 <<< 32134 1727204466.21291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.21342: stderr chunk (state=3): >>><<< 32134 1727204466.21345: stdout chunk (state=3): >>><<< 32134 1727204466.21360: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1727204466.1853428-33903-77376668357334=/root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.21402: variable 'ansible_module_compression' from source: unknown 32134 1727204466.21451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32134 1727204466.21487: variable 'ansible_facts' from source: unknown 32134 1727204466.21552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py 32134 1727204466.21665: Sending initial data 32134 1727204466.21669: Sent initial data (152 bytes) 32134 1727204466.22136: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.22139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204466.22142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32134 1727204466.22144: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204466.22147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.22201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.22204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.22246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.23948: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204466.23983: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204466.24021: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpsrhwal8z /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py <<< 32134 1727204466.24025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py" <<< 32134 1727204466.24056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpsrhwal8z" to remote "/root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py" <<< 32134 1727204466.24062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py" <<< 32134 1727204466.24831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.24902: stderr chunk (state=3): >>><<< 32134 1727204466.24906: stdout chunk (state=3): >>><<< 32134 1727204466.24927: done transferring module to remote 32134 1727204466.24939: _low_level_execute_command(): starting 32134 1727204466.24944: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/ /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py && sleep 0' 32134 1727204466.25413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.25418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.25420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204466.25423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.25477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.25487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 
1727204466.25522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.27447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.27500: stderr chunk (state=3): >>><<< 32134 1727204466.27504: stdout chunk (state=3): >>><<< 32134 1727204466.27520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.27523: _low_level_execute_command(): starting 32134 1727204466.27530: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/AnsiballZ_stat.py && sleep 0' 32134 1727204466.27998: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.28001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204466.28004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204466.28006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 32134 1727204466.28009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.28061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.28065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.28119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.45760: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": 
"/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32134 1727204466.47396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 32134 1727204466.47400: stdout chunk (state=3): >>><<< 32134 1727204466.47403: stderr chunk (state=3): >>><<< 32134 1727204466.47406: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
32134 1727204466.47409: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204466.47421: _low_level_execute_command(): starting 32134 1727204466.47431: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204466.1853428-33903-77376668357334/ > /dev/null 2>&1 && sleep 0' 32134 1727204466.48100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204466.48118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.48133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.48163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204466.48275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 32134 1727204466.48294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204466.48316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.48392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.50484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.50499: stdout chunk (state=3): >>><<< 32134 1727204466.50512: stderr chunk (state=3): >>><<< 32134 1727204466.50558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.50572: handler run complete 32134 1727204466.50695: attempt loop complete, returning result 32134 1727204466.50698: _execute() done 32134 1727204466.50701: dumping result to json 32134 1727204466.50703: done dumping result, returning 32134 1727204466.50705: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-753f-5162-00000000062b] 32134 1727204466.50708: sending task result for task 12b410aa-8751-753f-5162-00000000062b 32134 1727204466.50797: done sending task result for task 12b410aa-8751-753f-5162-00000000062b 32134 1727204466.50800: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 32134 1727204466.50880: no more pending results, returning what we have 32134 1727204466.50885: results queue empty 32134 1727204466.50886: checking for any_errors_fatal 32134 1727204466.50902: done checking for any_errors_fatal 32134 1727204466.50903: checking for max_fail_percentage 32134 1727204466.50906: done checking for max_fail_percentage 32134 1727204466.50907: checking to see if all hosts have failed and the running result is not ok 32134 1727204466.50908: done checking to see if all hosts have failed 32134 1727204466.50909: getting the remaining hosts for this loop 32134 1727204466.50911: done getting the remaining hosts for this loop 32134 1727204466.50916: getting the next task for host managed-node2 32134 1727204466.50925: done getting next task for host managed-node2 32134 1727204466.50929: ^ task is: TASK: Set NM profile exist flag based on the profile files 32134 1727204466.50935: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204466.50940: getting variables 32134 1727204466.50942: in VariableManager get_vars() 32134 1727204466.50980: Calling all_inventory to load vars for managed-node2 32134 1727204466.50983: Calling groups_inventory to load vars for managed-node2 32134 1727204466.50988: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204466.51209: Calling all_plugins_play to load vars for managed-node2 32134 1727204466.51213: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204466.51218: Calling groups_plugins_play to load vars for managed-node2 32134 1727204466.57902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204466.59955: done with get_vars() 32134 1727204466.59996: done getting variables 32134 1727204466.60080: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.460) 0:00:41.005 ***** 32134 1727204466.60113: entering _queue_task() for managed-node2/set_fact 32134 1727204466.60401: worker is 1 (out of 1 available) 32134 1727204466.60414: exiting _queue_task() for managed-node2/set_fact 32134 1727204466.60426: done queuing things up, now waiting for results queue to drain 32134 1727204466.60428: waiting for pending results... 
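The set_fact task queued here (get_profile_stat.yml:17) is skipped below because profile_stat.stat.exists evaluates to false. The assignment in this sketch is an assumption based only on the task name and that conditional; the log records the skip, not the task body:

# Assumed shape of the task; only the 'when' condition is confirmed by the
# skip result below ("false_condition": "profile_stat.stat.exists").
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists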
32134 1727204466.60627: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 32134 1727204466.60743: in run() - task 12b410aa-8751-753f-5162-00000000062c 32134 1727204466.60756: variable 'ansible_search_path' from source: unknown 32134 1727204466.60762: variable 'ansible_search_path' from source: unknown 32134 1727204466.60797: calling self._execute() 32134 1727204466.60883: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.60887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.60900: variable 'omit' from source: magic vars 32134 1727204466.61247: variable 'ansible_distribution_major_version' from source: facts 32134 1727204466.61258: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204466.61364: variable 'profile_stat' from source: set_fact 32134 1727204466.61377: Evaluated conditional (profile_stat.stat.exists): False 32134 1727204466.61381: when evaluation is False, skipping this task 32134 1727204466.61384: _execute() done 32134 1727204466.61386: dumping result to json 32134 1727204466.61394: done dumping result, returning 32134 1727204466.61401: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-753f-5162-00000000062c] 32134 1727204466.61407: sending task result for task 12b410aa-8751-753f-5162-00000000062c 32134 1727204466.61510: done sending task result for task 12b410aa-8751-753f-5162-00000000062c 32134 1727204466.61513: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32134 1727204466.61591: no more pending results, returning what we have 32134 1727204466.61596: results queue empty 32134 1727204466.61598: checking for any_errors_fatal 32134 1727204466.61609: done checking for any_errors_fatal 32134 1727204466.61610: checking for max_fail_percentage 32134 1727204466.61611: done checking for max_fail_percentage 32134 1727204466.61613: checking to see if all hosts have failed and the running result is not ok 32134 1727204466.61614: done checking to see if all hosts have failed 32134 1727204466.61615: getting the remaining hosts for this loop 32134 1727204466.61616: done getting the remaining hosts for this loop 32134 1727204466.61620: getting the next task for host managed-node2 32134 1727204466.61628: done getting next task for host managed-node2 32134 1727204466.61631: ^ task is: TASK: Get NM profile info 32134 1727204466.61637: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204466.61641: getting variables 32134 1727204466.61643: in VariableManager get_vars() 32134 1727204466.61671: Calling all_inventory to load vars for managed-node2 32134 1727204466.61674: Calling groups_inventory to load vars for managed-node2 32134 1727204466.61678: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204466.61691: Calling all_plugins_play to load vars for managed-node2 32134 1727204466.61694: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204466.61698: Calling groups_plugins_play to load vars for managed-node2 32134 1727204466.63863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204466.65597: done with get_vars() 32134 1727204466.65645: done getting variables 32134 1727204466.65720: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.056) 0:00:41.061 ***** 32134 1727204466.65758: entering _queue_task() for managed-node2/shell 32134 1727204466.66154: worker is 1 (out of 1 available) 32134 1727204466.66168: exiting _queue_task() for managed-node2/shell 32134 1727204466.66180: done queuing things up, now waiting for results queue to drain 32134 1727204466.66182: waiting for pending results... 32134 1727204466.66521: running TaskExecutor() for managed-node2/TASK: Get NM profile info 32134 1727204466.66795: in run() - task 12b410aa-8751-753f-5162-00000000062d 32134 1727204466.66799: variable 'ansible_search_path' from source: unknown 32134 1727204466.66802: variable 'ansible_search_path' from source: unknown 32134 1727204466.66804: calling self._execute() 32134 1727204466.66863: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.66877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.66898: variable 'omit' from source: magic vars 32134 1727204466.67380: variable 'ansible_distribution_major_version' from source: facts 32134 1727204466.67402: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204466.67418: variable 'omit' from source: magic vars 32134 1727204466.67488: variable 'omit' from source: magic vars 32134 1727204466.67598: variable 'profile' from source: include params 32134 1727204466.67608: variable 'interface' from source: set_fact 32134 1727204466.67765: variable 'interface' from source: set_fact 32134 1727204466.67768: variable 'omit' from source: magic vars 32134 1727204466.67771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204466.67813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204466.67845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204466.67870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204466.67876: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204466.68094: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204466.68098: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.68101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.68104: Set connection var ansible_timeout to 10 32134 1727204466.68106: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204466.68108: Set connection var ansible_connection to ssh 32134 1727204466.68113: Set connection var ansible_shell_type to sh 32134 1727204466.68115: Set connection var ansible_shell_executable to /bin/sh 32134 1727204466.68118: Set connection var ansible_pipelining to False 32134 1727204466.68140: variable 'ansible_shell_executable' from source: unknown 32134 1727204466.68149: variable 'ansible_connection' from source: unknown 32134 1727204466.68156: variable 'ansible_module_compression' from source: unknown 32134 1727204466.68167: variable 'ansible_shell_type' from source: unknown 32134 1727204466.68176: variable 'ansible_shell_executable' from source: unknown 32134 1727204466.68184: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204466.68197: variable 'ansible_pipelining' from source: unknown 32134 1727204466.68205: variable 'ansible_timeout' from source: unknown 32134 1727204466.68218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204466.68394: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204466.68420: variable 'omit' from source: magic vars 32134 1727204466.68431: starting attempt loop 32134 1727204466.68438: running the handler 32134 1727204466.68454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204466.68482: _low_level_execute_command(): starting 32134 1727204466.68498: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204466.69227: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.69246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.69259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.69305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.69327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.69374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.71200: stdout chunk (state=3): >>>/root <<< 32134 1727204466.71399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.71403: stdout chunk (state=3): >>><<< 32134 1727204466.71405: stderr chunk (state=3): >>><<< 32134 1727204466.71408: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.71421: _low_level_execute_command(): starting 32134 1727204466.71432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393 `" && echo ansible-tmp-1727204466.7137954-33925-270928721822393="` echo /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393 `" ) && sleep 0' 32134 1727204466.72025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204466.72046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.72067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.72083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204466.72160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.72176: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204466.72188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204466.72216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.72267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.74361: stdout chunk (state=3): >>>ansible-tmp-1727204466.7137954-33925-270928721822393=/root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393 <<< 32134 1727204466.74476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.74535: stderr chunk (state=3): >>><<< 32134 1727204466.74538: stdout chunk (state=3): >>><<< 32134 1727204466.74556: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204466.7137954-33925-270928721822393=/root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.74586: variable 'ansible_module_compression' from source: unknown 32134 1727204466.74639: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204466.74675: variable 'ansible_facts' from source: unknown 32134 1727204466.74734: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py 32134 1727204466.74848: Sending initial data 32134 1727204466.74852: Sent initial data (156 bytes) 32134 1727204466.75315: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.75318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204466.75321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204466.75323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.75377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204466.75380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.75421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.77055: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32134 1727204466.77065: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204466.77095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204466.77132: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpsdfx9iy0 /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py <<< 32134 1727204466.77140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py" <<< 32134 1727204466.77175: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpsdfx9iy0" to remote "/root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py" <<< 32134 1727204466.77951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.78011: stderr chunk (state=3): >>><<< 32134 1727204466.78017: stdout chunk (state=3): >>><<< 32134 1727204466.78040: done transferring module to remote 32134 1727204466.78054: _low_level_execute_command(): starting 32134 1727204466.78060: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/ /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py && sleep 0' 32134 1727204466.78519: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.78523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.78525: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204466.78532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.78579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204466.78585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.78629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204466.80481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204466.80533: stderr chunk (state=3): >>><<< 32134 1727204466.80537: stdout chunk (state=3): >>><<< 32134 1727204466.80555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204466.80558: _low_level_execute_command(): starting 32134 1727204466.80564: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/AnsiballZ_command.py && sleep 0' 32134 1727204466.81022: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.81025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.81028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204466.81030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204466.81085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204466.81094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204466.81140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204467.00735: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:01:06.987785", "end": "2024-09-24 15:01:07.006361", "delta": "0:00:00.018576", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204467.02442: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. <<< 32134 1727204467.02505: stderr chunk (state=3): >>><<< 32134 1727204467.02509: stdout chunk (state=3): >>><<< 32134 1727204467.02531: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:01:06.987785", "end": "2024-09-24 15:01:07.006361", "delta": "0:00:00.018576", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
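The JSON blob just returned is the result of the "Get NM profile info" command task: the shell pipeline nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc exited 1 because no ethtest0 profile backed by a file under /etc exists anymore. From the module args dumped in the next event (_raw_params, _uses_shell=True), the "...ignoring" marker on the failed result, and the nm_profile_exists.rc check evaluated by the following task, the task — presumably in the same get_profile_stat.yml include referenced by the surrounding tasks — likely looks roughly like this sketch (the profile templating and exact file contents are assumptions, not shown in this log):

- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"
  register: nm_profile_exists
  ignore_errors: true

An rc of 1 from this pipeline is the expected "profile absent" outcome, which is why the failure is ignored and only the registered return code is acted on afterwards.
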
32134 1727204467.02566: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204467.02578: _low_level_execute_command(): starting 32134 1727204467.02584: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204466.7137954-33925-270928721822393/ > /dev/null 2>&1 && sleep 0' 32134 1727204467.03076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204467.03081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204467.03087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204467.03092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204467.03152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204467.03158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204467.03160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204467.03203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204467.05181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204467.05233: stderr chunk (state=3): >>><<< 32134 1727204467.05237: stdout chunk (state=3): >>><<< 32134 1727204467.05256: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204467.05265: handler run complete 32134 1727204467.05287: Evaluated conditional (False): False 32134 1727204467.05299: attempt loop complete, returning result 32134 1727204467.05302: _execute() done 32134 1727204467.05305: dumping result to json 32134 1727204467.05312: done dumping result, returning 32134 1727204467.05322: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-753f-5162-00000000062d] 32134 1727204467.05327: sending task result for task 12b410aa-8751-753f-5162-00000000062d fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.018576", "end": "2024-09-24 15:01:07.006361", "rc": 1, "start": "2024-09-24 15:01:06.987785" } MSG: non-zero return code ...ignoring 32134 1727204467.05530: no more pending results, returning what we have 32134 1727204467.05534: results queue empty 32134 1727204467.05535: checking for any_errors_fatal 32134 1727204467.05543: done checking for any_errors_fatal 32134 1727204467.05544: checking for max_fail_percentage 32134 1727204467.05546: done checking for max_fail_percentage 32134 1727204467.05547: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.05549: done checking to see if all hosts have failed 32134 1727204467.05550: getting the remaining hosts for this loop 32134 1727204467.05551: done getting the remaining hosts for this loop 32134 1727204467.05556: getting the next task for host managed-node2 32134 1727204467.05563: done getting next task for host managed-node2 32134 1727204467.05566: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32134 1727204467.05571: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.05574: getting variables 32134 1727204467.05576: in VariableManager get_vars() 32134 1727204467.05617: Calling all_inventory to load vars for managed-node2 32134 1727204467.05621: Calling groups_inventory to load vars for managed-node2 32134 1727204467.05624: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.05632: done sending task result for task 12b410aa-8751-753f-5162-00000000062d 32134 1727204467.05635: WORKER PROCESS EXITING 32134 1727204467.05646: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.05649: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.05652: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.06927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.08643: done with get_vars() 32134 1727204467.08666: done getting variables 32134 1727204467.08720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.429) 0:00:41.491 ***** 32134 1727204467.08749: entering _queue_task() for managed-node2/set_fact 32134 1727204467.09002: worker is 1 (out of 1 available) 32134 1727204467.09016: exiting _queue_task() for managed-node2/set_fact 32134 1727204467.09029: done queuing things up, now waiting for results queue to drain 32134 1727204467.09031: waiting for pending results... 
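The set_fact task queued here (get_profile_stat.yml:35) only fires when the nmcli lookup succeeded; as the next lines show, it is skipped because nm_profile_exists.rc is 1, which leaves the profile-exists flag at whatever it was initialized to earlier in the include (evidently false, given the assert result later on). A minimal sketch of the gated task, assuming the lsr_net_profile_exists name read by the later assert and an ansible_managed flag whose exact name never appears in this log:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true   # flag name assumed; only the task name implies it
  when: nm_profile_exists.rc == 0
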
32134 1727204467.09229: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32134 1727204467.09336: in run() - task 12b410aa-8751-753f-5162-00000000062e 32134 1727204467.09351: variable 'ansible_search_path' from source: unknown 32134 1727204467.09355: variable 'ansible_search_path' from source: unknown 32134 1727204467.09393: calling self._execute() 32134 1727204467.09475: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.09480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.09495: variable 'omit' from source: magic vars 32134 1727204467.09836: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.09847: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.09962: variable 'nm_profile_exists' from source: set_fact 32134 1727204467.09976: Evaluated conditional (nm_profile_exists.rc == 0): False 32134 1727204467.09979: when evaluation is False, skipping this task 32134 1727204467.09982: _execute() done 32134 1727204467.09987: dumping result to json 32134 1727204467.09992: done dumping result, returning 32134 1727204467.10002: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-753f-5162-00000000062e] 32134 1727204467.10008: sending task result for task 12b410aa-8751-753f-5162-00000000062e 32134 1727204467.10115: done sending task result for task 12b410aa-8751-753f-5162-00000000062e 32134 1727204467.10118: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 32134 1727204467.10186: no more pending results, returning what we have 32134 1727204467.10192: results queue empty 32134 1727204467.10193: checking for any_errors_fatal 32134 1727204467.10208: done checking for any_errors_fatal 32134 1727204467.10209: checking for max_fail_percentage 32134 1727204467.10211: done checking for max_fail_percentage 32134 1727204467.10212: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.10213: done checking to see if all hosts have failed 32134 1727204467.10214: getting the remaining hosts for this loop 32134 1727204467.10215: done getting the remaining hosts for this loop 32134 1727204467.10219: getting the next task for host managed-node2 32134 1727204467.10228: done getting next task for host managed-node2 32134 1727204467.10230: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 32134 1727204467.10235: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 32134 1727204467.10239: getting variables 32134 1727204467.10240: in VariableManager get_vars() 32134 1727204467.10267: Calling all_inventory to load vars for managed-node2 32134 1727204467.10270: Calling groups_inventory to load vars for managed-node2 32134 1727204467.10273: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.10284: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.10287: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.10293: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.11503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.13111: done with get_vars() 32134 1727204467.13134: done getting variables 32134 1727204467.13186: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204467.13286: variable 'profile' from source: include params 32134 1727204467.13291: variable 'interface' from source: set_fact 32134 1727204467.13344: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.046) 0:00:41.537 ***** 32134 1727204467.13372: entering _queue_task() for managed-node2/command 32134 1727204467.13615: worker is 1 (out of 1 available) 32134 1727204467.13630: exiting _queue_task() for managed-node2/command 32134 1727204467.13643: done queuing things up, now waiting for results queue to drain 32134 1727204467.13645: waiting for pending results... 
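The ifcfg-related checks that follow (get_profile_stat.yml:49 through :69) are all gated on profile_stat.stat.exists, i.e. on an earlier stat of the ifcfg-ethtest0 file, and every one of them is skipped below because that file does not exist. The actual commands never run here, so the following is only a rough sketch of the pattern for the first of them; the ifcfg path and register name are assumptions:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path assumed
  register: ifcfg_ansible_managed   # register name assumed; not visible in this log
  when: profile_stat.stat.exists
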
32134 1727204467.13838: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 32134 1727204467.13935: in run() - task 12b410aa-8751-753f-5162-000000000630 32134 1727204467.13949: variable 'ansible_search_path' from source: unknown 32134 1727204467.13952: variable 'ansible_search_path' from source: unknown 32134 1727204467.13987: calling self._execute() 32134 1727204467.14069: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.14076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.14092: variable 'omit' from source: magic vars 32134 1727204467.14409: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.14424: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.14531: variable 'profile_stat' from source: set_fact 32134 1727204467.14545: Evaluated conditional (profile_stat.stat.exists): False 32134 1727204467.14549: when evaluation is False, skipping this task 32134 1727204467.14552: _execute() done 32134 1727204467.14555: dumping result to json 32134 1727204467.14561: done dumping result, returning 32134 1727204467.14568: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-753f-5162-000000000630] 32134 1727204467.14574: sending task result for task 12b410aa-8751-753f-5162-000000000630 32134 1727204467.14670: done sending task result for task 12b410aa-8751-753f-5162-000000000630 32134 1727204467.14673: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32134 1727204467.14729: no more pending results, returning what we have 32134 1727204467.14733: results queue empty 32134 1727204467.14735: checking for any_errors_fatal 32134 1727204467.14744: done checking for any_errors_fatal 32134 1727204467.14745: checking for max_fail_percentage 32134 1727204467.14746: done checking for max_fail_percentage 32134 1727204467.14747: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.14749: done checking to see if all hosts have failed 32134 1727204467.14749: getting the remaining hosts for this loop 32134 1727204467.14751: done getting the remaining hosts for this loop 32134 1727204467.14755: getting the next task for host managed-node2 32134 1727204467.14763: done getting next task for host managed-node2 32134 1727204467.14766: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 32134 1727204467.14770: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.14773: getting variables 32134 1727204467.14775: in VariableManager get_vars() 32134 1727204467.14803: Calling all_inventory to load vars for managed-node2 32134 1727204467.14806: Calling groups_inventory to load vars for managed-node2 32134 1727204467.14810: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.14821: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.14824: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.14827: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.16180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.17758: done with get_vars() 32134 1727204467.17780: done getting variables 32134 1727204467.17833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204467.17918: variable 'profile' from source: include params 32134 1727204467.17922: variable 'interface' from source: set_fact 32134 1727204467.17968: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.046) 0:00:41.583 ***** 32134 1727204467.17995: entering _queue_task() for managed-node2/set_fact 32134 1727204467.18225: worker is 1 (out of 1 available) 32134 1727204467.18238: exiting _queue_task() for managed-node2/set_fact 32134 1727204467.18251: done queuing things up, now waiting for results queue to drain 32134 1727204467.18253: waiting for pending results... 
32134 1727204467.18440: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 32134 1727204467.18538: in run() - task 12b410aa-8751-753f-5162-000000000631 32134 1727204467.18549: variable 'ansible_search_path' from source: unknown 32134 1727204467.18553: variable 'ansible_search_path' from source: unknown 32134 1727204467.18585: calling self._execute() 32134 1727204467.18668: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.18674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.18686: variable 'omit' from source: magic vars 32134 1727204467.19003: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.19013: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.19120: variable 'profile_stat' from source: set_fact 32134 1727204467.19133: Evaluated conditional (profile_stat.stat.exists): False 32134 1727204467.19139: when evaluation is False, skipping this task 32134 1727204467.19143: _execute() done 32134 1727204467.19146: dumping result to json 32134 1727204467.19148: done dumping result, returning 32134 1727204467.19160: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-753f-5162-000000000631] 32134 1727204467.19162: sending task result for task 12b410aa-8751-753f-5162-000000000631 32134 1727204467.19253: done sending task result for task 12b410aa-8751-753f-5162-000000000631 32134 1727204467.19257: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32134 1727204467.19313: no more pending results, returning what we have 32134 1727204467.19317: results queue empty 32134 1727204467.19318: checking for any_errors_fatal 32134 1727204467.19323: done checking for any_errors_fatal 32134 1727204467.19324: checking for max_fail_percentage 32134 1727204467.19325: done checking for max_fail_percentage 32134 1727204467.19326: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.19328: done checking to see if all hosts have failed 32134 1727204467.19328: getting the remaining hosts for this loop 32134 1727204467.19330: done getting the remaining hosts for this loop 32134 1727204467.19334: getting the next task for host managed-node2 32134 1727204467.19340: done getting next task for host managed-node2 32134 1727204467.19343: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 32134 1727204467.19347: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.19351: getting variables 32134 1727204467.19353: in VariableManager get_vars() 32134 1727204467.19381: Calling all_inventory to load vars for managed-node2 32134 1727204467.19384: Calling groups_inventory to load vars for managed-node2 32134 1727204467.19387: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.19406: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.19410: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.19413: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.20611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.22311: done with get_vars() 32134 1727204467.22334: done getting variables 32134 1727204467.22383: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204467.22474: variable 'profile' from source: include params 32134 1727204467.22477: variable 'interface' from source: set_fact 32134 1727204467.22525: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.045) 0:00:41.629 ***** 32134 1727204467.22552: entering _queue_task() for managed-node2/command 32134 1727204467.22806: worker is 1 (out of 1 available) 32134 1727204467.22820: exiting _queue_task() for managed-node2/command 32134 1727204467.22833: done queuing things up, now waiting for results queue to drain 32134 1727204467.22835: waiting for pending results... 
32134 1727204467.23022: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 32134 1727204467.23123: in run() - task 12b410aa-8751-753f-5162-000000000632 32134 1727204467.23135: variable 'ansible_search_path' from source: unknown 32134 1727204467.23139: variable 'ansible_search_path' from source: unknown 32134 1727204467.23174: calling self._execute() 32134 1727204467.23266: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.23271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.23288: variable 'omit' from source: magic vars 32134 1727204467.23602: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.23616: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.23720: variable 'profile_stat' from source: set_fact 32134 1727204467.23736: Evaluated conditional (profile_stat.stat.exists): False 32134 1727204467.23740: when evaluation is False, skipping this task 32134 1727204467.23744: _execute() done 32134 1727204467.23748: dumping result to json 32134 1727204467.23751: done dumping result, returning 32134 1727204467.23754: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-753f-5162-000000000632] 32134 1727204467.23760: sending task result for task 12b410aa-8751-753f-5162-000000000632 32134 1727204467.23856: done sending task result for task 12b410aa-8751-753f-5162-000000000632 32134 1727204467.23859: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32134 1727204467.23918: no more pending results, returning what we have 32134 1727204467.23923: results queue empty 32134 1727204467.23924: checking for any_errors_fatal 32134 1727204467.23930: done checking for any_errors_fatal 32134 1727204467.23931: checking for max_fail_percentage 32134 1727204467.23933: done checking for max_fail_percentage 32134 1727204467.23934: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.23935: done checking to see if all hosts have failed 32134 1727204467.23936: getting the remaining hosts for this loop 32134 1727204467.23938: done getting the remaining hosts for this loop 32134 1727204467.23942: getting the next task for host managed-node2 32134 1727204467.23951: done getting next task for host managed-node2 32134 1727204467.23953: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 32134 1727204467.23958: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.23961: getting variables 32134 1727204467.23963: in VariableManager get_vars() 32134 1727204467.23990: Calling all_inventory to load vars for managed-node2 32134 1727204467.23994: Calling groups_inventory to load vars for managed-node2 32134 1727204467.23997: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.24009: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.24014: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.24018: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.25248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.26876: done with get_vars() 32134 1727204467.26904: done getting variables 32134 1727204467.26955: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204467.27049: variable 'profile' from source: include params 32134 1727204467.27052: variable 'interface' from source: set_fact 32134 1727204467.27099: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.045) 0:00:41.675 ***** 32134 1727204467.27129: entering _queue_task() for managed-node2/set_fact 32134 1727204467.27391: worker is 1 (out of 1 available) 32134 1727204467.27404: exiting _queue_task() for managed-node2/set_fact 32134 1727204467.27419: done queuing things up, now waiting for results queue to drain 32134 1727204467.27421: waiting for pending results... 
32134 1727204467.27609: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 32134 1727204467.27708: in run() - task 12b410aa-8751-753f-5162-000000000633 32134 1727204467.27722: variable 'ansible_search_path' from source: unknown 32134 1727204467.27726: variable 'ansible_search_path' from source: unknown 32134 1727204467.27763: calling self._execute() 32134 1727204467.27845: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.27851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.27868: variable 'omit' from source: magic vars 32134 1727204467.28185: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.28200: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.28304: variable 'profile_stat' from source: set_fact 32134 1727204467.28322: Evaluated conditional (profile_stat.stat.exists): False 32134 1727204467.28326: when evaluation is False, skipping this task 32134 1727204467.28329: _execute() done 32134 1727204467.28331: dumping result to json 32134 1727204467.28334: done dumping result, returning 32134 1727204467.28341: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-753f-5162-000000000633] 32134 1727204467.28347: sending task result for task 12b410aa-8751-753f-5162-000000000633 32134 1727204467.28446: done sending task result for task 12b410aa-8751-753f-5162-000000000633 32134 1727204467.28449: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32134 1727204467.28504: no more pending results, returning what we have 32134 1727204467.28508: results queue empty 32134 1727204467.28510: checking for any_errors_fatal 32134 1727204467.28519: done checking for any_errors_fatal 32134 1727204467.28520: checking for max_fail_percentage 32134 1727204467.28522: done checking for max_fail_percentage 32134 1727204467.28523: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.28524: done checking to see if all hosts have failed 32134 1727204467.28525: getting the remaining hosts for this loop 32134 1727204467.28527: done getting the remaining hosts for this loop 32134 1727204467.28531: getting the next task for host managed-node2 32134 1727204467.28540: done getting next task for host managed-node2 32134 1727204467.28543: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 32134 1727204467.28546: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.28550: getting variables 32134 1727204467.28552: in VariableManager get_vars() 32134 1727204467.28578: Calling all_inventory to load vars for managed-node2 32134 1727204467.28581: Calling groups_inventory to load vars for managed-node2 32134 1727204467.28585: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.28598: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.28601: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.28605: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.30005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.31638: done with get_vars() 32134 1727204467.31672: done getting variables 32134 1727204467.31746: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204467.31879: variable 'profile' from source: include params 32134 1727204467.31883: variable 'interface' from source: set_fact 32134 1727204467.31960: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.048) 0:00:41.723 ***** 32134 1727204467.31997: entering _queue_task() for managed-node2/assert 32134 1727204467.32349: worker is 1 (out of 1 available) 32134 1727204467.32363: exiting _queue_task() for managed-node2/assert 32134 1727204467.32376: done queuing things up, now waiting for results queue to drain 32134 1727204467.32378: waiting for pending results... 
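With the profile checks done, the include wrapper assert_profile_absent.yml:5 now runs the actual assertion. The conditional evaluated below, (not lsr_net_profile_exists): True, pins down its core; a reconstruction of the task, with everything beyond the that clause (such as a custom failure message) unknown from this log:

- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists
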
32134 1727204467.32814: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'ethtest0' 32134 1727204467.32849: in run() - task 12b410aa-8751-753f-5162-000000000613 32134 1727204467.32874: variable 'ansible_search_path' from source: unknown 32134 1727204467.32885: variable 'ansible_search_path' from source: unknown 32134 1727204467.32943: calling self._execute() 32134 1727204467.33067: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.33084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.33128: variable 'omit' from source: magic vars 32134 1727204467.33575: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.33599: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.33674: variable 'omit' from source: magic vars 32134 1727204467.33693: variable 'omit' from source: magic vars 32134 1727204467.33835: variable 'profile' from source: include params 32134 1727204467.33848: variable 'interface' from source: set_fact 32134 1727204467.33946: variable 'interface' from source: set_fact 32134 1727204467.33977: variable 'omit' from source: magic vars 32134 1727204467.34039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204467.34092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204467.34194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204467.34198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204467.34201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204467.34233: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204467.34245: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.34256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.34402: Set connection var ansible_timeout to 10 32134 1727204467.34434: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204467.34447: Set connection var ansible_connection to ssh 32134 1727204467.34597: Set connection var ansible_shell_type to sh 32134 1727204467.34601: Set connection var ansible_shell_executable to /bin/sh 32134 1727204467.34604: Set connection var ansible_pipelining to False 32134 1727204467.34607: variable 'ansible_shell_executable' from source: unknown 32134 1727204467.34609: variable 'ansible_connection' from source: unknown 32134 1727204467.34615: variable 'ansible_module_compression' from source: unknown 32134 1727204467.34618: variable 'ansible_shell_type' from source: unknown 32134 1727204467.34620: variable 'ansible_shell_executable' from source: unknown 32134 1727204467.34622: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.34625: variable 'ansible_pipelining' from source: unknown 32134 1727204467.34628: variable 'ansible_timeout' from source: unknown 32134 1727204467.34630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.34762: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204467.34784: variable 'omit' from source: magic vars 32134 1727204467.34801: starting attempt loop 32134 1727204467.34813: running the handler 32134 1727204467.34976: variable 'lsr_net_profile_exists' from source: set_fact 32134 1727204467.34992: Evaluated conditional (not lsr_net_profile_exists): True 32134 1727204467.35006: handler run complete 32134 1727204467.35033: attempt loop complete, returning result 32134 1727204467.35042: _execute() done 32134 1727204467.35051: dumping result to json 32134 1727204467.35073: done dumping result, returning 32134 1727204467.35077: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'ethtest0' [12b410aa-8751-753f-5162-000000000613] 32134 1727204467.35094: sending task result for task 12b410aa-8751-753f-5162-000000000613 32134 1727204467.35259: done sending task result for task 12b410aa-8751-753f-5162-000000000613 32134 1727204467.35263: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 32134 1727204467.35349: no more pending results, returning what we have 32134 1727204467.35354: results queue empty 32134 1727204467.35355: checking for any_errors_fatal 32134 1727204467.35365: done checking for any_errors_fatal 32134 1727204467.35366: checking for max_fail_percentage 32134 1727204467.35368: done checking for max_fail_percentage 32134 1727204467.35370: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.35371: done checking to see if all hosts have failed 32134 1727204467.35372: getting the remaining hosts for this loop 32134 1727204467.35374: done getting the remaining hosts for this loop 32134 1727204467.35379: getting the next task for host managed-node2 32134 1727204467.35507: done getting next task for host managed-node2 32134 1727204467.35519: ^ task is: TASK: Include the task 'assert_device_absent.yml' 32134 1727204467.35522: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.35528: getting variables 32134 1727204467.35530: in VariableManager get_vars() 32134 1727204467.35902: Calling all_inventory to load vars for managed-node2 32134 1727204467.35906: Calling groups_inventory to load vars for managed-node2 32134 1727204467.35914: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.35927: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.35931: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.35936: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.37565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.40828: done with get_vars() 32134 1727204467.40865: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:89 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.089) 0:00:41.813 ***** 32134 1727204467.40980: entering _queue_task() for managed-node2/include_tasks 32134 1727204467.41350: worker is 1 (out of 1 available) 32134 1727204467.41364: exiting _queue_task() for managed-node2/include_tasks 32134 1727204467.41378: done queuing things up, now waiting for results queue to drain 32134 1727204467.41380: waiting for pending results... 32134 1727204467.41811: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 32134 1727204467.41824: in run() - task 12b410aa-8751-753f-5162-00000000009e 32134 1727204467.41846: variable 'ansible_search_path' from source: unknown 32134 1727204467.41895: calling self._execute() 32134 1727204467.42011: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.42026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.42054: variable 'omit' from source: magic vars 32134 1727204467.42531: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.42550: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.42563: _execute() done 32134 1727204467.42572: dumping result to json 32134 1727204467.42591: done dumping result, returning 32134 1727204467.42701: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [12b410aa-8751-753f-5162-00000000009e] 32134 1727204467.42704: sending task result for task 12b410aa-8751-753f-5162-00000000009e 32134 1727204467.42788: done sending task result for task 12b410aa-8751-753f-5162-00000000009e 32134 1727204467.42809: WORKER PROCESS EXITING 32134 1727204467.42842: no more pending results, returning what we have 32134 1727204467.42847: in VariableManager get_vars() 32134 1727204467.42886: Calling all_inventory to load vars for managed-node2 32134 1727204467.42891: Calling groups_inventory to load vars for managed-node2 32134 1727204467.42895: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.42915: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.42919: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.42923: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.44305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.46611: done with get_vars() 32134 
1727204467.46646: variable 'ansible_search_path' from source: unknown 32134 1727204467.46664: we have included files to process 32134 1727204467.46665: generating all_blocks data 32134 1727204467.46667: done generating all_blocks data 32134 1727204467.46674: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 32134 1727204467.46676: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 32134 1727204467.46679: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 32134 1727204467.46872: in VariableManager get_vars() 32134 1727204467.46894: done with get_vars() 32134 1727204467.47028: done processing included file 32134 1727204467.47031: iterating over new_blocks loaded from include file 32134 1727204467.47033: in VariableManager get_vars() 32134 1727204467.47047: done with get_vars() 32134 1727204467.47049: filtering new block on tags 32134 1727204467.47074: done filtering new block on tags 32134 1727204467.47077: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 32134 1727204467.47083: extending task lists for all hosts with included blocks 32134 1727204467.47393: done extending task lists 32134 1727204467.47395: done processing included files 32134 1727204467.47396: results queue empty 32134 1727204467.47397: checking for any_errors_fatal 32134 1727204467.47401: done checking for any_errors_fatal 32134 1727204467.47402: checking for max_fail_percentage 32134 1727204467.47404: done checking for max_fail_percentage 32134 1727204467.47405: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.47406: done checking to see if all hosts have failed 32134 1727204467.47407: getting the remaining hosts for this loop 32134 1727204467.47408: done getting the remaining hosts for this loop 32134 1727204467.47411: getting the next task for host managed-node2 32134 1727204467.47415: done getting next task for host managed-node2 32134 1727204467.47418: ^ task is: TASK: Include the task 'get_interface_stat.yml' 32134 1727204467.47421: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.47424: getting variables 32134 1727204467.47425: in VariableManager get_vars() 32134 1727204467.47436: Calling all_inventory to load vars for managed-node2 32134 1727204467.47439: Calling groups_inventory to load vars for managed-node2 32134 1727204467.47442: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.47448: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.47452: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.47455: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.49703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.52729: done with get_vars() 32134 1727204467.52770: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.119) 0:00:41.932 ***** 32134 1727204467.52888: entering _queue_task() for managed-node2/include_tasks 32134 1727204467.53269: worker is 1 (out of 1 available) 32134 1727204467.53284: exiting _queue_task() for managed-node2/include_tasks 32134 1727204467.53297: done queuing things up, now waiting for results queue to drain 32134 1727204467.53299: waiting for pending results... 32134 1727204467.53621: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 32134 1727204467.53699: in run() - task 12b410aa-8751-753f-5162-000000000664 32134 1727204467.53724: variable 'ansible_search_path' from source: unknown 32134 1727204467.53733: variable 'ansible_search_path' from source: unknown 32134 1727204467.53775: calling self._execute() 32134 1727204467.53887: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.53904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.53931: variable 'omit' from source: magic vars 32134 1727204467.54382: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.54474: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.54477: _execute() done 32134 1727204467.54480: dumping result to json 32134 1727204467.54482: done dumping result, returning 32134 1727204467.54485: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-753f-5162-000000000664] 32134 1727204467.54487: sending task result for task 12b410aa-8751-753f-5162-000000000664 32134 1727204467.54560: done sending task result for task 12b410aa-8751-753f-5162-000000000664 32134 1727204467.54563: WORKER PROCESS EXITING 32134 1727204467.54611: no more pending results, returning what we have 32134 1727204467.54617: in VariableManager get_vars() 32134 1727204467.54656: Calling all_inventory to load vars for managed-node2 32134 1727204467.54659: Calling groups_inventory to load vars for managed-node2 32134 1727204467.54664: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.54682: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.54686: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.54691: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.57174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 32134 1727204467.60248: done with get_vars() 32134 1727204467.60280: variable 'ansible_search_path' from source: unknown 32134 1727204467.60282: variable 'ansible_search_path' from source: unknown 32134 1727204467.60332: we have included files to process 32134 1727204467.60333: generating all_blocks data 32134 1727204467.60336: done generating all_blocks data 32134 1727204467.60337: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32134 1727204467.60338: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32134 1727204467.60341: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32134 1727204467.60574: done processing included file 32134 1727204467.60577: iterating over new_blocks loaded from include file 32134 1727204467.60579: in VariableManager get_vars() 32134 1727204467.60595: done with get_vars() 32134 1727204467.60597: filtering new block on tags 32134 1727204467.60618: done filtering new block on tags 32134 1727204467.60621: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 32134 1727204467.60627: extending task lists for all hosts with included blocks 32134 1727204467.60762: done extending task lists 32134 1727204467.60764: done processing included files 32134 1727204467.60765: results queue empty 32134 1727204467.60766: checking for any_errors_fatal 32134 1727204467.60770: done checking for any_errors_fatal 32134 1727204467.60771: checking for max_fail_percentage 32134 1727204467.60772: done checking for max_fail_percentage 32134 1727204467.60773: checking to see if all hosts have failed and the running result is not ok 32134 1727204467.60774: done checking to see if all hosts have failed 32134 1727204467.60775: getting the remaining hosts for this loop 32134 1727204467.60776: done getting the remaining hosts for this loop 32134 1727204467.60780: getting the next task for host managed-node2 32134 1727204467.60784: done getting next task for host managed-node2 32134 1727204467.60787: ^ task is: TASK: Get stat for interface {{ interface }} 32134 1727204467.60792: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204467.60795: getting variables 32134 1727204467.60796: in VariableManager get_vars() 32134 1727204467.60807: Calling all_inventory to load vars for managed-node2 32134 1727204467.60809: Calling groups_inventory to load vars for managed-node2 32134 1727204467.60812: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204467.60819: Calling all_plugins_play to load vars for managed-node2 32134 1727204467.60822: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204467.60826: Calling groups_plugins_play to load vars for managed-node2 32134 1727204467.62973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204467.65977: done with get_vars() 32134 1727204467.66046: done getting variables 32134 1727204467.66250: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.133) 0:00:42.066 ***** 32134 1727204467.66288: entering _queue_task() for managed-node2/stat 32134 1727204467.66678: worker is 1 (out of 1 available) 32134 1727204467.66796: exiting _queue_task() for managed-node2/stat 32134 1727204467.66807: done queuing things up, now waiting for results queue to drain 32134 1727204467.66809: waiting for pending results... 32134 1727204467.67112: running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 32134 1727204467.67178: in run() - task 12b410aa-8751-753f-5162-000000000687 32134 1727204467.67207: variable 'ansible_search_path' from source: unknown 32134 1727204467.67217: variable 'ansible_search_path' from source: unknown 32134 1727204467.67263: calling self._execute() 32134 1727204467.67378: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.67425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.67429: variable 'omit' from source: magic vars 32134 1727204467.67873: variable 'ansible_distribution_major_version' from source: facts 32134 1727204467.67896: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204467.67910: variable 'omit' from source: magic vars 32134 1727204467.68076: variable 'omit' from source: magic vars 32134 1727204467.68109: variable 'interface' from source: set_fact 32134 1727204467.68133: variable 'omit' from source: magic vars 32134 1727204467.68194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204467.68247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204467.68278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204467.68316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204467.68336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204467.68375: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204467.68391: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.68405: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 32134 1727204467.68552: Set connection var ansible_timeout to 10 32134 1727204467.68575: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204467.68583: Set connection var ansible_connection to ssh 32134 1727204467.68641: Set connection var ansible_shell_type to sh 32134 1727204467.68645: Set connection var ansible_shell_executable to /bin/sh 32134 1727204467.68647: Set connection var ansible_pipelining to False 32134 1727204467.68660: variable 'ansible_shell_executable' from source: unknown 32134 1727204467.68669: variable 'ansible_connection' from source: unknown 32134 1727204467.68676: variable 'ansible_module_compression' from source: unknown 32134 1727204467.68682: variable 'ansible_shell_type' from source: unknown 32134 1727204467.68691: variable 'ansible_shell_executable' from source: unknown 32134 1727204467.68704: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204467.68749: variable 'ansible_pipelining' from source: unknown 32134 1727204467.68756: variable 'ansible_timeout' from source: unknown 32134 1727204467.68758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204467.69041: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 32134 1727204467.69062: variable 'omit' from source: magic vars 32134 1727204467.69078: starting attempt loop 32134 1727204467.69086: running the handler 32134 1727204467.69184: _low_level_execute_command(): starting 32134 1727204467.69187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204467.69978: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204467.70032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204467.70051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204467.70082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204467.70169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204467.71933: stdout chunk (state=3): >>>/root <<< 32134 1727204467.72114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204467.72140: stderr chunk (state=3): >>><<< 32134 1727204467.72157: stdout chunk (state=3): >>><<< 32134 1727204467.72194: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204467.72223: _low_level_execute_command(): starting 32134 1727204467.72237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792 `" && echo ansible-tmp-1727204467.7220213-33952-216819941948792="` echo /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792 `" ) && sleep 0' 32134 1727204467.72895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204467.72911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204467.72928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204467.72953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204467.73011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204467.73095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204467.73112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204467.73135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204467.73212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204467.75284: stdout chunk (state=3): >>>ansible-tmp-1727204467.7220213-33952-216819941948792=/root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792 <<< 32134 1727204467.75481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204467.75506: stderr chunk (state=3): >>><<< 32134 1727204467.75520: stdout chunk (state=3): >>><<< 32134 1727204467.75549: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204467.7220213-33952-216819941948792=/root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204467.75631: variable 'ansible_module_compression' from source: unknown 32134 1727204467.75702: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32134 1727204467.75897: variable 'ansible_facts' from source: unknown 32134 1727204467.75900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py 32134 1727204467.76043: Sending initial data 32134 1727204467.76053: Sent initial data (153 bytes) 32134 1727204467.76767: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204467.76783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204467.76810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204467.76917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204467.76950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204467.76969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204467.76998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204467.77083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204467.78804: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204467.78870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204467.78933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpiw9lr8n9 /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py <<< 32134 1727204467.78937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py" <<< 32134 1727204467.78970: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpiw9lr8n9" to remote "/root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py" <<< 32134 1727204467.80091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204467.80135: stderr chunk (state=3): >>><<< 32134 1727204467.80146: stdout chunk (state=3): >>><<< 32134 1727204467.80175: done transferring module to remote 32134 1727204467.80204: _low_level_execute_command(): starting 32134 1727204467.80217: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/ /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py && sleep 0' 32134 1727204467.80878: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204467.80896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204467.80916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204467.80938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204467.80963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 32134 1727204467.81076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204467.81098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 32134 1727204467.81176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204467.83227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204467.83250: stdout chunk (state=3): >>><<< 32134 1727204467.83386: stderr chunk (state=3): >>><<< 32134 1727204467.83393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204467.83396: _low_level_execute_command(): starting 32134 1727204467.83398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/AnsiballZ_stat.py && sleep 0' 32134 1727204467.84155: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204467.84159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204467.84180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204467.84275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.02046: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32134 1727204468.03708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204468.03715: stdout chunk (state=3): >>><<< 32134 1727204468.03718: stderr chunk (state=3): >>><<< 32134 1727204468.03892: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204468.03896: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204468.03898: _low_level_execute_command(): starting 32134 1727204468.03901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204467.7220213-33952-216819941948792/ > /dev/null 2>&1 && sleep 0' 32134 1727204468.04508: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204468.04578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204468.04640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204468.04657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.04695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.04780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.06855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204468.06866: stdout chunk (state=3): >>><<< 32134 1727204468.06881: stderr chunk (state=3): >>><<< 32134 1727204468.06915: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204468.07095: handler run complete 32134 1727204468.07098: attempt loop complete, returning result 32134 1727204468.07101: _execute() done 32134 1727204468.07103: dumping result to json 32134 1727204468.07105: done dumping result, returning 32134 1727204468.07107: done running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 [12b410aa-8751-753f-5162-000000000687] 32134 1727204468.07110: sending task result for task 12b410aa-8751-753f-5162-000000000687 32134 1727204468.07186: done sending task result for task 12b410aa-8751-753f-5162-000000000687 32134 1727204468.07191: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 32134 1727204468.07274: no more pending results, returning what we have 32134 1727204468.07278: results queue empty 32134 1727204468.07280: checking for any_errors_fatal 32134 1727204468.07282: done checking for any_errors_fatal 32134 1727204468.07283: checking for max_fail_percentage 32134 1727204468.07285: done checking for max_fail_percentage 32134 1727204468.07286: checking to see if all hosts have failed and the running result is not ok 32134 1727204468.07288: done checking to see if all hosts have failed 32134 1727204468.07291: getting the remaining hosts for this loop 32134 1727204468.07293: done getting the remaining hosts for this loop 32134 1727204468.07298: getting the next task for host managed-node2 32134 1727204468.07308: done getting next task for host managed-node2 32134 1727204468.07313: ^ task is: TASK: Assert that the interface is absent - '{{ 
interface }}' 32134 1727204468.07316: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204468.07321: getting variables 32134 1727204468.07323: in VariableManager get_vars() 32134 1727204468.07355: Calling all_inventory to load vars for managed-node2 32134 1727204468.07358: Calling groups_inventory to load vars for managed-node2 32134 1727204468.07362: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.07375: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.07379: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.07382: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.10068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204468.13105: done with get_vars() 32134 1727204468.13147: done getting variables 32134 1727204468.13226: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 32134 1727204468.13369: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:01:08 -0400 (0:00:00.471) 0:00:42.538 ***** 32134 1727204468.13409: entering _queue_task() for managed-node2/assert 32134 1727204468.13795: worker is 1 (out of 1 available) 32134 1727204468.13809: exiting _queue_task() for managed-node2/assert 32134 1727204468.13820: done queuing things up, now waiting for results queue to drain 32134 1727204468.13823: waiting for pending results... 
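
[Sketch inserted for readability: the two included task files exercised in the records above, reconstructed from the task names, file paths, and stat module arguments visible in this log. This is a minimal approximation of tasks/get_interface_stat.yml and tasks/assert_device_absent.yml; the actual files in the fedora.linux_system_roles collection may carry additional options.]

# tasks/get_interface_stat.yml (sketch, per the task at get_interface_stat.yml:3)
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

# tasks/assert_device_absent.yml (sketch, per the tasks at assert_device_absent.yml:3 and :5)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists

[With interface set to ethtest0 by an earlier set_fact, the stat call above returned {"stat": {"exists": false}}, so the assertion queued next passes. The surrounding _low_level_execute_command() records show the usual remote execution sequence: create a ~/.ansible/tmp/ansible-tmp-* directory, sftp the AnsiballZ_stat.py payload over, chmod it, run it with /usr/bin/python3.12, then remove the directory.]
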
32134 1727204468.14215: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'ethtest0' 32134 1727204468.14226: in run() - task 12b410aa-8751-753f-5162-000000000665 32134 1727204468.14311: variable 'ansible_search_path' from source: unknown 32134 1727204468.14314: variable 'ansible_search_path' from source: unknown 32134 1727204468.14317: calling self._execute() 32134 1727204468.14436: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.14450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.14467: variable 'omit' from source: magic vars 32134 1727204468.14939: variable 'ansible_distribution_major_version' from source: facts 32134 1727204468.14964: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204468.14978: variable 'omit' from source: magic vars 32134 1727204468.15039: variable 'omit' from source: magic vars 32134 1727204468.15179: variable 'interface' from source: set_fact 32134 1727204468.15203: variable 'omit' from source: magic vars 32134 1727204468.15288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204468.15310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204468.15340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204468.15367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204468.15386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204468.15594: variable 'inventory_hostname' from source: host vars for 'managed-node2' 32134 1727204468.15598: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.15600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.15603: Set connection var ansible_timeout to 10 32134 1727204468.15605: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204468.15607: Set connection var ansible_connection to ssh 32134 1727204468.15609: Set connection var ansible_shell_type to sh 32134 1727204468.15622: Set connection var ansible_shell_executable to /bin/sh 32134 1727204468.15635: Set connection var ansible_pipelining to False 32134 1727204468.15667: variable 'ansible_shell_executable' from source: unknown 32134 1727204468.15676: variable 'ansible_connection' from source: unknown 32134 1727204468.15685: variable 'ansible_module_compression' from source: unknown 32134 1727204468.15695: variable 'ansible_shell_type' from source: unknown 32134 1727204468.15702: variable 'ansible_shell_executable' from source: unknown 32134 1727204468.15710: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.15723: variable 'ansible_pipelining' from source: unknown 32134 1727204468.15731: variable 'ansible_timeout' from source: unknown 32134 1727204468.15740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.15920: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 32134 1727204468.15946: variable 'omit' from source: magic vars 32134 1727204468.15959: starting attempt loop 32134 1727204468.15968: running the handler 32134 1727204468.16164: variable 'interface_stat' from source: set_fact 32134 1727204468.16181: Evaluated conditional (not interface_stat.stat.exists): True 32134 1727204468.16198: handler run complete 32134 1727204468.16224: attempt loop complete, returning result 32134 1727204468.16233: _execute() done 32134 1727204468.16242: dumping result to json 32134 1727204468.16268: done dumping result, returning 32134 1727204468.16271: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'ethtest0' [12b410aa-8751-753f-5162-000000000665] 32134 1727204468.16276: sending task result for task 12b410aa-8751-753f-5162-000000000665 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 32134 1727204468.16558: no more pending results, returning what we have 32134 1727204468.16563: results queue empty 32134 1727204468.16565: checking for any_errors_fatal 32134 1727204468.16577: done checking for any_errors_fatal 32134 1727204468.16579: checking for max_fail_percentage 32134 1727204468.16581: done checking for max_fail_percentage 32134 1727204468.16582: checking to see if all hosts have failed and the running result is not ok 32134 1727204468.16583: done checking to see if all hosts have failed 32134 1727204468.16584: getting the remaining hosts for this loop 32134 1727204468.16586: done getting the remaining hosts for this loop 32134 1727204468.16593: getting the next task for host managed-node2 32134 1727204468.16603: done getting next task for host managed-node2 32134 1727204468.16608: ^ task is: TASK: Verify network state restored to default 32134 1727204468.16611: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204468.16615: getting variables 32134 1727204468.16617: in VariableManager get_vars() 32134 1727204468.16652: Calling all_inventory to load vars for managed-node2 32134 1727204468.16656: Calling groups_inventory to load vars for managed-node2 32134 1727204468.16662: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.16676: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.16680: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.16684: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.17606: done sending task result for task 12b410aa-8751-753f-5162-000000000665 32134 1727204468.17609: WORKER PROCESS EXITING 32134 1727204468.19077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204468.22108: done with get_vars() 32134 1727204468.22144: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:91 Tuesday 24 September 2024 15:01:08 -0400 (0:00:00.088) 0:00:42.626 ***** 32134 1727204468.22255: entering _queue_task() for managed-node2/include_tasks 32134 1727204468.22623: worker is 1 (out of 1 available) 32134 1727204468.22638: exiting _queue_task() for managed-node2/include_tasks 32134 1727204468.22651: done queuing things up, now waiting for results queue to drain 32134 1727204468.22652: waiting for pending results... 32134 1727204468.22935: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 32134 1727204468.23049: in run() - task 12b410aa-8751-753f-5162-00000000009f 32134 1727204468.23071: variable 'ansible_search_path' from source: unknown 32134 1727204468.23121: calling self._execute() 32134 1727204468.23241: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.23255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.23271: variable 'omit' from source: magic vars 32134 1727204468.23714: variable 'ansible_distribution_major_version' from source: facts 32134 1727204468.23732: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204468.23745: _execute() done 32134 1727204468.23754: dumping result to json 32134 1727204468.23767: done dumping result, returning 32134 1727204468.23777: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [12b410aa-8751-753f-5162-00000000009f] 32134 1727204468.23788: sending task result for task 12b410aa-8751-753f-5162-00000000009f 32134 1727204468.23926: no more pending results, returning what we have 32134 1727204468.23932: in VariableManager get_vars() 32134 1727204468.23970: Calling all_inventory to load vars for managed-node2 32134 1727204468.23974: Calling groups_inventory to load vars for managed-node2 32134 1727204468.23978: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.23997: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.24001: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.24005: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.24707: done sending task result for task 12b410aa-8751-753f-5162-00000000009f 32134 1727204468.24711: WORKER PROCESS EXITING 32134 1727204468.26414: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204468.29321: done with get_vars() 32134 1727204468.29356: variable 'ansible_search_path' from source: unknown 32134 1727204468.29373: we have included files to process 32134 1727204468.29374: generating all_blocks data 32134 1727204468.29376: done generating all_blocks data 32134 1727204468.29383: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32134 1727204468.29384: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32134 1727204468.29387: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32134 1727204468.29883: done processing included file 32134 1727204468.29886: iterating over new_blocks loaded from include file 32134 1727204468.29887: in VariableManager get_vars() 32134 1727204468.29902: done with get_vars() 32134 1727204468.29904: filtering new block on tags 32134 1727204468.29923: done filtering new block on tags 32134 1727204468.29926: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 32134 1727204468.29931: extending task lists for all hosts with included blocks 32134 1727204468.30365: done extending task lists 32134 1727204468.30367: done processing included files 32134 1727204468.30368: results queue empty 32134 1727204468.30369: checking for any_errors_fatal 32134 1727204468.30373: done checking for any_errors_fatal 32134 1727204468.30374: checking for max_fail_percentage 32134 1727204468.30376: done checking for max_fail_percentage 32134 1727204468.30377: checking to see if all hosts have failed and the running result is not ok 32134 1727204468.30378: done checking to see if all hosts have failed 32134 1727204468.30379: getting the remaining hosts for this loop 32134 1727204468.30380: done getting the remaining hosts for this loop 32134 1727204468.30384: getting the next task for host managed-node2 32134 1727204468.30388: done getting next task for host managed-node2 32134 1727204468.30393: ^ task is: TASK: Check routes and DNS 32134 1727204468.30396: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204468.30398: getting variables 32134 1727204468.30400: in VariableManager get_vars() 32134 1727204468.30410: Calling all_inventory to load vars for managed-node2 32134 1727204468.30413: Calling groups_inventory to load vars for managed-node2 32134 1727204468.30416: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.30422: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.30426: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.30429: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.37184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204468.40350: done with get_vars() 32134 1727204468.40395: done getting variables 32134 1727204468.40455: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:01:08 -0400 (0:00:00.182) 0:00:42.808 ***** 32134 1727204468.40488: entering _queue_task() for managed-node2/shell 32134 1727204468.40870: worker is 1 (out of 1 available) 32134 1727204468.40885: exiting _queue_task() for managed-node2/shell 32134 1727204468.41104: done queuing things up, now waiting for results queue to drain 32134 1727204468.41107: waiting for pending results... 32134 1727204468.41217: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 32134 1727204468.41370: in run() - task 12b410aa-8751-753f-5162-00000000069f 32134 1727204468.41394: variable 'ansible_search_path' from source: unknown 32134 1727204468.41403: variable 'ansible_search_path' from source: unknown 32134 1727204468.41449: calling self._execute() 32134 1727204468.41573: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.41665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.41669: variable 'omit' from source: magic vars 32134 1727204468.42075: variable 'ansible_distribution_major_version' from source: facts 32134 1727204468.42100: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204468.42116: variable 'omit' from source: magic vars 32134 1727204468.42179: variable 'omit' from source: magic vars 32134 1727204468.42239: variable 'omit' from source: magic vars 32134 1727204468.42288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32134 1727204468.42345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32134 1727204468.42378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32134 1727204468.42408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204468.42432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32134 1727204468.42596: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
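
[Sketch inserted for readability: the records above queue the 'Check routes and DNS' shell task at check_network_dns.yml:6, pulled in via the 'Verify network state restored to default' include at tests_ipv6_disabled.yml:91; the records that follow set up the SSH connection for it. The shell command itself is not visible in this excerpt, so the body below is illustrative only; a diagnostic task of this kind typically dumps the routing tables and resolver configuration.]

# tasks/check_network_dns.yml (illustrative sketch; the actual command is not shown in this excerpt)
- name: Check routes and DNS
  shell: |
    ip route
    ip -6 route
    cat /etc/resolv.conf
  changed_when: false
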
32134 1727204468.42600: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.42603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.42652: Set connection var ansible_timeout to 10 32134 1727204468.42676: Set connection var ansible_module_compression to ZIP_DEFLATED 32134 1727204468.42685: Set connection var ansible_connection to ssh 32134 1727204468.42696: Set connection var ansible_shell_type to sh 32134 1727204468.42717: Set connection var ansible_shell_executable to /bin/sh 32134 1727204468.42897: Set connection var ansible_pipelining to False 32134 1727204468.42931: variable 'ansible_shell_executable' from source: unknown 32134 1727204468.42942: variable 'ansible_connection' from source: unknown 32134 1727204468.42950: variable 'ansible_module_compression' from source: unknown 32134 1727204468.42958: variable 'ansible_shell_type' from source: unknown 32134 1727204468.42966: variable 'ansible_shell_executable' from source: unknown 32134 1727204468.42974: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.42983: variable 'ansible_pipelining' from source: unknown 32134 1727204468.42999: variable 'ansible_timeout' from source: unknown 32134 1727204468.43095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.43185: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204468.43213: variable 'omit' from source: magic vars 32134 1727204468.43228: starting attempt loop 32134 1727204468.43235: running the handler 32134 1727204468.43250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 32134 1727204468.43275: _low_level_execute_command(): starting 32134 1727204468.43287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32134 1727204468.44033: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204468.44049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204468.44068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204468.44099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32134 1727204468.44120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204468.44132: stderr chunk (state=3): >>>debug2: match not found <<< 32134 1727204468.44147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204468.44187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found <<< 32134 1727204468.44284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.44316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.44402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.46165: stdout chunk (state=3): >>>/root <<< 32134 1727204468.46276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204468.46371: stderr chunk (state=3): >>><<< 32134 1727204468.46387: stdout chunk (state=3): >>><<< 32134 1727204468.46432: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204468.46456: _low_level_execute_command(): starting 32134 1727204468.46468: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808 `" && echo ansible-tmp-1727204468.4644036-33972-246386630072808="` echo /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808 `" ) && sleep 0' 32134 1727204468.47196: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204468.47223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204468.47268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204468.47286: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.47322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.47387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.49393: stdout chunk (state=3): >>>ansible-tmp-1727204468.4644036-33972-246386630072808=/root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808 <<< 32134 1727204468.49606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204468.49609: stderr chunk (state=3): >>><<< 32134 1727204468.49615: stdout chunk (state=3): >>><<< 32134 1727204468.49796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204468.4644036-33972-246386630072808=/root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204468.49800: variable 'ansible_module_compression' from source: unknown 32134 1727204468.49802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32134fo5ktx0r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32134 1727204468.49805: variable 'ansible_facts' from source: unknown 32134 1727204468.49952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py 32134 1727204468.50170: Sending initial data 32134 1727204468.50176: Sent initial data (156 bytes) 32134 1727204468.51065: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204468.51133: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204468.51185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.51240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.51281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.52961: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32134 1727204468.52986: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32134 1727204468.53037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32134 1727204468.53097: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpn412tkx_ /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py <<< 32134 1727204468.53101: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py" <<< 32134 1727204468.53148: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-32134fo5ktx0r/tmpn412tkx_" to remote "/root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py" <<< 32134 1727204468.54377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204468.54439: stderr chunk (state=3): >>><<< 32134 1727204468.54451: stdout chunk (state=3): >>><<< 32134 1727204468.54487: done transferring module to remote 32134 1727204468.54517: _low_level_execute_command(): starting 32134 1727204468.54530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/ /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py && sleep 0' 32134 1727204468.55197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204468.55218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32134 1727204468.55270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32134 1727204468.55543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 32134 1727204468.55550: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204468.55609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204468.55656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.55672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.55839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.57895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204468.57899: stdout chunk (state=3): >>><<< 32134 1727204468.57903: stderr chunk (state=3): >>><<< 32134 1727204468.57909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204468.57915: _low_level_execute_command(): starting 32134 1727204468.57918: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/AnsiballZ_command.py && sleep 0' 32134 1727204468.58566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 32134 1727204468.58584: stderr chunk (state=3): >>>debug2: match found <<< 32134 1727204468.58608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 
1727204468.58703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204468.58728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.58780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.58826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.76932: stdout chunk (state=3): >>> <<< 32134 1727204468.76971: stdout chunk (state=3): >>>{"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3034sec preferred_lft 3034sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:01:08.759176", "end": "2024-09-24 15:01:08.768270", "delta": "0:00:00.009094", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32134 1727204468.79121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 32134 1727204468.79125: stderr chunk (state=3): >>><<< 32134 1727204468.79128: stdout chunk (state=3): >>><<< 32134 1727204468.79131: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3034sec preferred_lft 3034sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:01:08.759176", "end": "2024-09-24 15:01:08.768270", "delta": "0:00:00.009094", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 32134 1727204468.79192: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32134 1727204468.79209: _low_level_execute_command(): starting 32134 1727204468.79227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204468.4644036-33972-246386630072808/ > /dev/null 2>&1 && sleep 0' 32134 1727204468.79935: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 32134 1727204468.80005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32134 1727204468.80080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 32134 1727204468.80109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32134 1727204468.80181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32134 1727204468.80202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32134 1727204468.82245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32134 1727204468.82266: stderr chunk (state=3): >>><<< 32134 1727204468.82279: stdout chunk (state=3): >>><<< 32134 1727204468.82309: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32134 1727204468.82350: handler run complete 32134 1727204468.82376: Evaluated conditional (False): False 32134 1727204468.82397: attempt loop complete, returning result 32134 1727204468.82456: _execute() done 32134 1727204468.82459: dumping result to json 32134 1727204468.82462: done dumping result, returning 32134 1727204468.82465: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [12b410aa-8751-753f-5162-00000000069f] 32134 1727204468.82467: sending task result for task 12b410aa-8751-753f-5162-00000000069f 32134 1727204468.82809: done sending task result for task 12b410aa-8751-753f-5162-00000000069f 32134 1727204468.82815: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009094", "end": "2024-09-24 15:01:08.768270", "rc": 0, "start": "2024-09-24 15:01:08.759176" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3034sec preferred_lft 3034sec inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. 
# # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 32134 1727204468.82915: no more pending results, returning what we have 32134 1727204468.82919: results queue empty 32134 1727204468.82920: checking for any_errors_fatal 32134 1727204468.82922: done checking for any_errors_fatal 32134 1727204468.82923: checking for max_fail_percentage 32134 1727204468.82926: done checking for max_fail_percentage 32134 1727204468.82928: checking to see if all hosts have failed and the running result is not ok 32134 1727204468.82929: done checking to see if all hosts have failed 32134 1727204468.82930: getting the remaining hosts for this loop 32134 1727204468.82932: done getting the remaining hosts for this loop 32134 1727204468.82937: getting the next task for host managed-node2 32134 1727204468.82943: done getting next task for host managed-node2 32134 1727204468.82946: ^ task is: TASK: Verify DNS and network connectivity 32134 1727204468.82948: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32134 1727204468.82957: getting variables 32134 1727204468.82959: in VariableManager get_vars() 32134 1727204468.82996: Calling all_inventory to load vars for managed-node2 32134 1727204468.82999: Calling groups_inventory to load vars for managed-node2 32134 1727204468.83003: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.83018: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.83021: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.83025: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.85591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204468.89035: done with get_vars() 32134 1727204468.89078: done getting variables 32134 1727204468.89160: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:01:08 -0400 (0:00:00.487) 0:00:43.296 ***** 32134 1727204468.89209: entering _queue_task() for managed-node2/shell 32134 1727204468.89635: worker is 1 (out of 1 available) 32134 1727204468.89647: exiting _queue_task() for managed-node2/shell 32134 1727204468.89661: done queuing things up, now waiting for results queue to drain 32134 1727204468.89664: waiting for pending results... 32134 1727204468.90085: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 32134 1727204468.90147: in run() - task 12b410aa-8751-753f-5162-0000000006a0 32134 1727204468.90179: variable 'ansible_search_path' from source: unknown 32134 1727204468.90194: variable 'ansible_search_path' from source: unknown 32134 1727204468.90241: calling self._execute() 32134 1727204468.90399: variable 'ansible_host' from source: host vars for 'managed-node2' 32134 1727204468.90408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 32134 1727204468.90414: variable 'omit' from source: magic vars 32134 1727204468.90907: variable 'ansible_distribution_major_version' from source: facts 32134 1727204468.90946: Evaluated conditional (ansible_distribution_major_version != '6'): True 32134 1727204468.91173: variable 'ansible_facts' from source: unknown 32134 1727204468.92435: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 32134 1727204468.92445: when evaluation is False, skipping this task 32134 1727204468.92454: _execute() done 32134 1727204468.92699: dumping result to json 32134 1727204468.92703: done dumping result, returning 32134 1727204468.92706: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [12b410aa-8751-753f-5162-0000000006a0] 32134 1727204468.92708: sending task result for task 12b410aa-8751-753f-5162-0000000006a0 32134 1727204468.92784: done sending task result for task 12b410aa-8751-753f-5162-0000000006a0 32134 1727204468.92788: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 32134 1727204468.92847: no more 
pending results, returning what we have 32134 1727204468.92853: results queue empty 32134 1727204468.92854: checking for any_errors_fatal 32134 1727204468.92866: done checking for any_errors_fatal 32134 1727204468.92867: checking for max_fail_percentage 32134 1727204468.92869: done checking for max_fail_percentage 32134 1727204468.92871: checking to see if all hosts have failed and the running result is not ok 32134 1727204468.92874: done checking to see if all hosts have failed 32134 1727204468.92875: getting the remaining hosts for this loop 32134 1727204468.92877: done getting the remaining hosts for this loop 32134 1727204468.92882: getting the next task for host managed-node2 32134 1727204468.92894: done getting next task for host managed-node2 32134 1727204468.92897: ^ task is: TASK: meta (flush_handlers) 32134 1727204468.92899: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204468.92904: getting variables 32134 1727204468.92906: in VariableManager get_vars() 32134 1727204468.92943: Calling all_inventory to load vars for managed-node2 32134 1727204468.92947: Calling groups_inventory to load vars for managed-node2 32134 1727204468.92951: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.92968: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.92972: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.92977: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.95533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204468.98622: done with get_vars() 32134 1727204468.98659: done getting variables 32134 1727204468.98722: in VariableManager get_vars() 32134 1727204468.98730: Calling all_inventory to load vars for managed-node2 32134 1727204468.98732: Calling groups_inventory to load vars for managed-node2 32134 1727204468.98734: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204468.98739: Calling all_plugins_play to load vars for managed-node2 32134 1727204468.98740: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204468.98744: Calling groups_plugins_play to load vars for managed-node2 32134 1727204468.99945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204469.02119: done with get_vars() 32134 1727204469.02150: done queuing things up, now waiting for results queue to drain 32134 1727204469.02152: results queue empty 32134 1727204469.02152: checking for any_errors_fatal 32134 1727204469.02154: done checking for any_errors_fatal 32134 1727204469.02155: checking for max_fail_percentage 32134 1727204469.02156: done checking for max_fail_percentage 32134 1727204469.02157: checking to see if all hosts have failed and the running result is not ok 32134 1727204469.02157: done checking to see if all hosts have failed 32134 1727204469.02158: getting the remaining hosts for this loop 32134 1727204469.02159: done getting the remaining hosts for this loop 32134 1727204469.02161: getting the next task for host managed-node2 32134 1727204469.02164: done getting next task for host managed-node2 32134 1727204469.02166: ^ task is: TASK: meta (flush_handlers) 32134 
1727204469.02167: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32134 1727204469.02169: getting variables 32134 1727204469.02170: in VariableManager get_vars() 32134 1727204469.02177: Calling all_inventory to load vars for managed-node2 32134 1727204469.02179: Calling groups_inventory to load vars for managed-node2 32134 1727204469.02181: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204469.02186: Calling all_plugins_play to load vars for managed-node2 32134 1727204469.02188: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204469.02192: Calling groups_plugins_play to load vars for managed-node2 32134 1727204469.03310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204469.05866: done with get_vars() 32134 1727204469.05891: done getting variables 32134 1727204469.05938: in VariableManager get_vars() 32134 1727204469.05946: Calling all_inventory to load vars for managed-node2 32134 1727204469.05949: Calling groups_inventory to load vars for managed-node2 32134 1727204469.05952: Calling all_plugins_inventory to load vars for managed-node2 32134 1727204469.05958: Calling all_plugins_play to load vars for managed-node2 32134 1727204469.05960: Calling groups_plugins_inventory to load vars for managed-node2 32134 1727204469.05962: Calling groups_plugins_play to load vars for managed-node2 32134 1727204469.07320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32134 1727204469.10249: done with get_vars() 32134 1727204469.10299: done queuing things up, now waiting for results queue to drain 32134 1727204469.10302: results queue empty 32134 1727204469.10303: checking for any_errors_fatal 32134 1727204469.10304: done checking for any_errors_fatal 32134 1727204469.10305: checking for max_fail_percentage 32134 1727204469.10307: done checking for max_fail_percentage 32134 1727204469.10308: checking to see if all hosts have failed and the running result is not ok 32134 1727204469.10309: done checking to see if all hosts have failed 32134 1727204469.10312: getting the remaining hosts for this loop 32134 1727204469.10314: done getting the remaining hosts for this loop 32134 1727204469.10325: getting the next task for host managed-node2 32134 1727204469.10329: done getting next task for host managed-node2 32134 1727204469.10330: ^ task is: None 32134 1727204469.10336: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
32134 1727204469.10338: done queuing things up, now waiting for results queue to drain
32134 1727204469.10339: results queue empty
32134 1727204469.10340: checking for any_errors_fatal
32134 1727204469.10341: done checking for any_errors_fatal
32134 1727204469.10343: checking for max_fail_percentage
32134 1727204469.10344: done checking for max_fail_percentage
32134 1727204469.10345: checking to see if all hosts have failed and the running result is not ok
32134 1727204469.10346: done checking to see if all hosts have failed
32134 1727204469.10349: getting the next task for host managed-node2
32134 1727204469.10353: done getting next task for host managed-node2
32134 1727204469.10354: ^ task is: None
32134 1727204469.10355: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2 : ok=74 changed=2 unreachable=0 failed=0 skipped=76 rescued=0 ignored=1

Tuesday 24 September 2024 15:01:09 -0400 (0:00:00.212) 0:00:43.508 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.60s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.33s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.29s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 2.14s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.89s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
Create veth interface ethtest0 ------------------------------------------ 1.66s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.36s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3
Gathering Facts --------------------------------------------------------- 1.17s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.11s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.05s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.02s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.94s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.86s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.78s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.76s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.75s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
32134 1727204469.10465: RUNNING CLEANUP
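
Reader note: the shell snippet executed by the "Check routes and DNS" task is recorded verbatim in the module_args above, so the task at playbooks/tasks/check_network_dns.yml:6 presumably looks roughly like the sketch below. This is reconstructed from this log, not copied from the collection, so names and exact formatting may differ. The skipped "Verify DNS and network connectivity" task (check_network_dns.yml:24) is shown only with its conditional, taken from the false_condition in the skip result; its command never ran on this host and is not captured in this output.

    - name: Check routes and DNS
      ansible.builtin.shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi

    - name: Verify DNS and network connectivity
      ansible.builtin.shell: |
        # body not shown in this log; the task was skipped on this host
        ...
      when: ansible_facts["distribution"] == "CentOS"

Because the first snippet starts with set -euo pipefail, any missing command (ip, cat) or unreadable file makes the whole diagnostic fail with a nonzero rc instead of silently printing partial output, which is why the recorded result carries rc=0 together with the full IP, route, and resolv.conf dump.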