[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
8299 1726877632.90711: starting run
ansible-playbook [core 2.16.11]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Z4Q
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
8299 1726877632.91122: Added group all to inventory
8299 1726877632.91123: Added group ungrouped to inventory
8299 1726877632.91126: Group all now contains ungrouped
8299 1726877632.91128: Examining possible inventory source: /tmp/ad_integration-Ay1/inventory.yml
8299 1726877633.01361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
8299 1726877633.01420: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
8299 1726877633.01437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
8299 1726877633.01492: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
8299 1726877633.01567: Loaded config def from plugin (inventory/script)
8299 1726877633.01570: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
8299 1726877633.01604: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
8299 1726877633.01697: Loaded config def from plugin (inventory/yaml)
8299 1726877633.01699: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
8299 1726877633.01790: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
8299 1726877633.02246: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
8299 1726877633.02249: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
8299 1726877633.02251: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
8299 1726877633.02262: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
8299 1726877633.02266: Loading data from /tmp/ad_integration-Ay1/inventory.yml
8299 1726877633.02312: /tmp/ad_integration-Ay1/inventory.yml was not parsable by auto
8299 1726877633.02357: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
8299 1726877633.02397: Loading data from /tmp/ad_integration-Ay1/inventory.yml
8299 1726877633.02482: group all already in inventory
8299 1726877633.02490: set inventory_file for managed_node1
8299 1726877633.02494: set inventory_dir for managed_node1
8299 1726877633.02495: Added host managed_node1 to inventory
8299 1726877633.02497: Added host managed_node1 to group all
8299 1726877633.02498: set ansible_host for managed_node1
8299 1726877633.02499: set ansible_ssh_extra_args for managed_node1
8299 1726877633.02502: set inventory_file for managed_node2
8299 1726877633.02505: set inventory_dir for managed_node2
8299 1726877633.02507: Added host managed_node2 to inventory
8299 1726877633.02508: Added host managed_node2 to group all
8299 1726877633.02509: set ansible_host for managed_node2
8299 1726877633.02510: set ansible_ssh_extra_args for managed_node2
8299 1726877633.02512: set inventory_file for managed_node3
8299 1726877633.02515: set inventory_dir for managed_node3
8299 1726877633.02515: Added host managed_node3 to inventory
8299 1726877633.02516: Added host managed_node3 to group all
8299 1726877633.02519: set ansible_host for managed_node3
8299 1726877633.02520: set ansible_ssh_extra_args for managed_node3
8299 1726877633.02522: Reconcile groups and hosts in inventory.
8299 1726877633.02526: Group ungrouped now contains managed_node1
8299 1726877633.02528: Group ungrouped now contains managed_node2
8299 1726877633.02529: Group ungrouped now contains managed_node3
8299 1726877633.02597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
8299 1726877633.02700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
8299 1726877633.02730: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
8299 1726877633.02747: Loaded config def from plugin (vars/host_group_vars)
8299 1726877633.02748: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
8299 1726877633.02753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
8299 1726877633.02761: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8299 1726877633.02790: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
8299 1726877633.03020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8299 1726877633.03122: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
8299 1726877633.03163: Loaded config def from plugin (connection/local)
8299 1726877633.03166: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
8299 1726877633.03681: Loaded config def from plugin (connection/paramiko_ssh)
8299 1726877633.03684: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
8299 1726877633.04577: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8299 1726877633.04617: Loaded config def from plugin (connection/psrp)
8299 1726877633.04621: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
8299 1726877633.05285: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8299 1726877633.05308: Loaded config def from plugin (connection/ssh)
8299 1726877633.05310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
8299 1726877633.06691: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8299 1726877633.06714: Loaded config def from plugin (connection/winrm)
8299 1726877633.06716: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
8299 1726877633.06739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
8299 1726877633.06787: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
8299 1726877633.06825: Loaded config def from plugin (shell/cmd)
[WARNING]: Could not match supplied host pattern, ignoring: ad
8299 1726877633.06826: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
8299 1726877633.06842: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
8299 1726877633.06881: Loaded config def from plugin (shell/powershell)
8299 1726877633.06883: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
8299 1726877633.06920: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
8299 1726877633.07020: Loaded config def from plugin (shell/sh)
8299 1726877633.07022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
8299 1726877633.07045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
8299 1726877633.07119: Loaded config def from plugin (become/runas)
8299 1726877633.07120: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
8299 1726877633.07231: Loaded config def from plugin (become/su)
8299 1726877633.07233: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
8299 1726877633.07327: Loaded config def from plugin (become/sudo)
8299 1726877633.07329: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
8299 1726877633.07351: Loading data from /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml
8299 1726877633.07648: trying /usr/local/lib/python3.12/site-packages/ansible/modules
8299 1726877633.09981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
8299 1726877633.10088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
8299 1726877633.10097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
8299 1726877633.10272: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
8299 1726877633.10417: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
8299 1726877633.10420: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Z4Q/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
8299 1726877633.10455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
8299 1726877633.10482: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
8299 1726877633.10601: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
8299 1726877633.10634: Loaded config def from plugin (callback/default)
8299 1726877633.10636: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
8299 1726877633.11547: Loaded config def from plugin (callback/junit)
8299 1726877633.11549: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
8299 1726877633.11581: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
8299 1726877633.11620: Loaded config def from plugin (callback/minimal)
8299 1726877633.11622: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
8299 1726877633.11647: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
8299 1726877633.11689: Loaded config def from plugin (callback/tree)
8299 1726877633.11691: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
8299 1726877633.11771: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
8299 1726877633.11773: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Z4Q/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
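The run above has no config file ("config file = None"), so the deprecation warning at the top comes from the plural ANSIBLE_COLLECTIONS_PATHS environment variable. Following the warning's own advice, a minimal ansible.cfg sketch would look like this (the collections path is taken from the header above; this is illustrative, not the job's actual config):

```ini
# ansible.cfg -- minimal sketch; path mirrors "ansible collection location" in the log header
[defaults]
# singular option replacing the deprecated plural ANSIBLE_COLLECTIONS_PATHS env var
collections_path = /tmp/collections-Z4Q
# silences [DEPRECATION WARNING] lines, as the warning itself suggests
deprecation_warnings = False
```

Alternatively, unset ANSIBLE_COLLECTIONS_PATHS in the environment and export the singular ANSIBLE_COLLECTIONS_PATH instead.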
PLAYBOOK: tests_default.yml ****************************************************
1 plays in /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml
8299 1726877633.11790: in VariableManager get_vars()
8299 1726877633.11809: Could not match supplied host pattern, ignoring: ad
8299 1726877633.11820: done with get_vars()
8299 1726877633.11824: in VariableManager get_vars()
8299 1726877633.11830: done with get_vars()
8299 1726877633.11833: variable 'omit' from source: magic vars
8299 1726877633.11860: in VariableManager get_vars()
8299 1726877633.11869: done with get_vars()
8299 1726877633.11886: variable 'omit' from source: magic vars

PLAY [Ensure role behaviour with default parameters] ***************************
8299 1726877633.12250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
8299 1726877633.12301: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
8299 1726877633.12323: getting the remaining hosts for this loop
8299 1726877633.12324: done getting the remaining hosts for this loop
8299 1726877633.12326: getting the next task for host managed_node2
8299 1726877633.12329: done getting next task for host managed_node2
8299 1726877633.12331: ^ task is: TASK: meta (flush_handlers)
8299 1726877633.12332: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8299 1726877633.12338: getting variables
8299 1726877633.12339: in VariableManager get_vars()
8299 1726877633.12349: Calling all_inventory to load vars for managed_node2
8299 1726877633.12354: Calling groups_inventory to load vars for managed_node2
8299 1726877633.12358: Calling all_plugins_inventory to load vars for managed_node2
8299 1726877633.12371: Calling all_plugins_play to load vars for managed_node2
8299 1726877633.12383: Calling groups_plugins_inventory to load vars for managed_node2
8299 1726877633.12389: Calling groups_plugins_play to load vars for managed_node2
8299 1726877633.12424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8299 1726877633.12475: done with get_vars()
8299 1726877633.12483: done getting variables
8299 1726877633.12522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
8299 1726877633.12557: in VariableManager get_vars()
8299 1726877633.12564: Calling all_inventory to load vars for managed_node2
8299 1726877633.12566: Calling groups_inventory to load vars for managed_node2
8299 1726877633.12567: Calling all_plugins_inventory to load vars for managed_node2
8299 1726877633.12570: Calling all_plugins_play to load vars for managed_node2
8299 1726877633.12571: Calling groups_plugins_inventory to load vars for managed_node2
8299 1726877633.12573: Calling groups_plugins_play to load vars for managed_node2
8299 1726877633.12589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8299 1726877633.12598: done with get_vars()
8299 1726877633.12608: done queuing things up, now waiting for results queue to drain
8299 1726877633.12610: results queue empty
8299 1726877633.12610: checking for any_errors_fatal
8299 1726877633.12613: done checking for any_errors_fatal
8299 1726877633.12613: checking for max_fail_percentage
8299 1726877633.12614: done checking for max_fail_percentage
8299 1726877633.12614: checking to see if all hosts have failed and the running result is not ok
8299 1726877633.12615: done checking to see if all hosts have failed
8299 1726877633.12615: getting the remaining hosts for this loop
8299 1726877633.12616: done getting the remaining hosts for this loop
8299 1726877633.12618: getting the next task for host managed_node2
8299 1726877633.12621: done getting next task for host managed_node2
8299 1726877633.12622: ^ task is: TASK: Include the role
8299 1726877633.12623: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8299 1726877633.12625: getting variables
8299 1726877633.12625: in VariableManager get_vars()
8299 1726877633.12644: Calling all_inventory to load vars for managed_node2
8299 1726877633.12646: Calling groups_inventory to load vars for managed_node2
8299 1726877633.12647: Calling all_plugins_inventory to load vars for managed_node2
8299 1726877633.12650: Calling all_plugins_play to load vars for managed_node2
8299 1726877633.12652: Calling groups_plugins_inventory to load vars for managed_node2
8299 1726877633.12655: Calling groups_plugins_play to load vars for managed_node2
8299 1726877633.12674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8299 1726877633.12682: done with get_vars()
8299 1726877633.12686: done getting variables

TASK [Include the role] ********************************************************
task path: /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:11
Friday 20 September 2024  20:13:53 -0400 (0:00:00.010)       0:00:00.010 ******
8299 1726877633.12734: entering _queue_task() for managed_node2/include_role
8299 1726877633.12736: Creating lock for include_role
8299 1726877633.12937: worker is 1 (out of 1 available)
8299 1726877633.12949: exiting _queue_task() for managed_node2/include_role
8299 1726877633.12960: done queuing things up, now waiting for results queue to drain
8299 1726877633.12962: waiting for pending results...
8312 1726877633.13035: running TaskExecutor() for managed_node2/TASK: Include the role
8312 1726877633.13135: in run() - task 0afffc7c-1039-1dfd-2729-000000000006
8312 1726877633.13151: variable 'ansible_search_path' from source: unknown
8312 1726877633.13180: calling self._execute()
8312 1726877633.13222: variable 'ansible_host' from source: host vars for 'managed_node2'
8312 1726877633.13229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8312 1726877633.13238: variable 'omit' from source: magic vars
8312 1726877633.13306: _execute() done
8312 1726877633.13312: dumping result to json
8312 1726877633.13316: done dumping result, returning
8312 1726877633.13319: done running TaskExecutor() for managed_node2/TASK: Include the role [0afffc7c-1039-1dfd-2729-000000000006]
8312 1726877633.13325: sending task result for task 0afffc7c-1039-1dfd-2729-000000000006
8312 1726877633.13346: done sending task result for task 0afffc7c-1039-1dfd-2729-000000000006
8312 1726877633.13347: WORKER PROCESS EXITING
8299 1726877633.13556: no more pending results, returning what we have
8299 1726877633.13560: in VariableManager get_vars()
8299 1726877633.13590: Calling all_inventory to load vars for managed_node2
8299 1726877633.13592: Calling groups_inventory to load vars for managed_node2
8299 1726877633.13595: Calling all_plugins_inventory to load vars for managed_node2
8299 1726877633.13602: Calling all_plugins_play to load vars for managed_node2
8299 1726877633.13604: Calling groups_plugins_inventory to load vars for managed_node2
8299 1726877633.13605: Calling groups_plugins_play to load vars for managed_node2
8299 1726877633.13653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8299 1726877633.13666: done with get_vars()
8299 1726877633.13670: variable 'ansible_search_path' from source: unknown
8299 1726877633.13720: variable 'omit' from source: magic vars
8299 1726877633.13735: variable 'omit' from source: magic vars
8299 1726877633.13744: variable 'omit' from source: magic vars
8299 1726877633.13747: we have included files to process
8299 1726877633.13748: generating all_blocks data
8299 1726877633.13749: done generating all_blocks data
8299 1726877633.13749: processing included file: fedora.linux_system_roles.ad_integration
8299 1726877633.13767: in VariableManager get_vars()
8299 1726877633.13779: done with get_vars()
8299 1726877633.13831: in VariableManager get_vars()
8299 1726877633.13843: done with get_vars()
8299 1726877633.13886: Loading data from /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/vars/main.yml
8299 1726877633.13978: Loading data from /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/defaults/main.yml
8299 1726877633.14050: Loading data from /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/meta/main.yml
8299 1726877633.14144: Loading data from /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml
8299 1726877633.17046: in VariableManager get_vars()
8299 1726877633.17074: done with get_vars()
8299 1726877633.19029: Loading data from /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml
8299 1726877633.19549: iterating over new_blocks loaded from include file
8299 1726877633.19550: in VariableManager get_vars()
8299 1726877633.19560: done with get_vars()
8299 1726877633.19562: filtering new block on tags
8299 1726877633.19596: done filtering new block on tags
8299 1726877633.19598: in VariableManager get_vars()
8299 1726877633.19605: done with get_vars()
8299 1726877633.19606: filtering new block on tags
8299 1726877633.19616: done filtering new block on tags
8299 1726877633.19617: done iterating over new_blocks loaded from include file
8299 1726877633.19617: extending task lists for all hosts with included blocks
8299 1726877633.19648: done extending task lists
8299 1726877633.19649: done processing included files
8299 1726877633.19649: results queue empty
8299 1726877633.19649: checking for any_errors_fatal
8299 1726877633.19651: done checking for any_errors_fatal
8299 1726877633.19651: checking for max_fail_percentage
8299 1726877633.19652: done checking for max_fail_percentage
8299 1726877633.19653: checking to see if all hosts have failed and the running result is not ok
8299 1726877633.19655: done checking to see if all hosts have failed
8299 1726877633.19655: getting the remaining hosts for this loop
8299 1726877633.19656: done getting the remaining hosts for this loop
8299 1726877633.19657: getting the next task for host managed_node2
8299 1726877633.19659: done getting next task for host managed_node2
8299 1726877633.19661: ^ task is: TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available
8299 1726877633.19662: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8299 1726877633.19674: getting variables
8299 1726877633.19675: in VariableManager get_vars()
8299 1726877633.19687: Calling all_inventory to load vars for managed_node2
8299 1726877633.19689: Calling groups_inventory to load vars for managed_node2
8299 1726877633.19691: Calling all_plugins_inventory to load vars for managed_node2
8299 1726877633.19694: Calling all_plugins_play to load vars for managed_node2
8299 1726877633.19695: Calling groups_plugins_inventory to load vars for managed_node2
8299 1726877633.19697: Calling groups_plugins_play to load vars for managed_node2
8299 1726877633.19736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8299 1726877633.19749: done with get_vars()
8299 1726877633.19756: done getting variables
8299 1726877633.19800: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] ***
task path: /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
Friday 20 September 2024  20:13:53 -0400 (0:00:00.070)       0:00:00.081 ******
8299 1726877633.19819: entering _queue_task() for managed_node2/fail
8299 1726877633.19820: Creating lock for fail
8299 1726877633.19991: worker is 1 (out of 1 available)
8299 1726877633.20003: exiting _queue_task() for managed_node2/fail
8299 1726877633.20014: done queuing things up, now waiting for results queue to drain
8299 1726877633.20015: waiting for pending results...
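The conditional recorded a few entries later ("Evaluated conditional (not ad_integration_realm): True") and the failure message further down suggest the guard task queued here, at roles/ad_integration/tasks/main.yml:3, is shaped roughly like this (a hedged reconstruction from the log, not the role's verbatim source):

```yaml
# tasks/main.yml -- sketch of the mandatory-variable guard implied by the log
- name: Ensure that mandatory variable ad_integration_realm is available
  fail:
    msg: Variable ad_integration_realm must be provided!
  when: not ad_integration_realm
```

Because the role's defaults leave ad_integration_realm empty, the `when` condition is true on a default run and the fail action fires, which is exactly what this test exercises.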
8313 1726877633.20114: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available
8313 1726877633.20246: in run() - task 0afffc7c-1039-1dfd-2729-000000000023
8313 1726877633.20266: variable 'ansible_search_path' from source: unknown
8313 1726877633.20271: variable 'ansible_search_path' from source: unknown
8313 1726877633.20298: calling self._execute()
8313 1726877633.20351: variable 'ansible_host' from source: host vars for 'managed_node2'
8313 1726877633.20363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8313 1726877633.20373: variable 'omit' from source: magic vars
8313 1726877633.20751: variable 'ad_integration_realm' from source: role '' defaults
8313 1726877633.20762: Evaluated conditional (not ad_integration_realm): True
8313 1726877633.20767: variable 'omit' from source: magic vars
8313 1726877633.20792: variable 'omit' from source: magic vars
8313 1726877633.20813: variable 'omit' from source: magic vars
8313 1726877633.20839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8313 1726877633.20865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8313 1726877633.20880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8313 1726877633.20895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8313 1726877633.20909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8313 1726877633.20937: variable 'inventory_hostname' from source: host vars for 'managed_node2'
8313 1726877633.20942: variable 'ansible_host' from source: host vars for 'managed_node2'
8313 1726877633.20945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8313 1726877633.21070: Set connection var ansible_pipelining to False
8313 1726877633.21077: Set connection var ansible_connection to ssh
8313 1726877633.21084: Set connection var ansible_shell_executable to /bin/sh
8313 1726877633.21089: Set connection var ansible_shell_type to sh
8313 1726877633.21096: Set connection var ansible_module_compression to ZIP_DEFLATED
8313 1726877633.21101: Set connection var ansible_timeout to 10
8313 1726877633.21116: variable 'ansible_shell_executable' from source: unknown
8313 1726877633.21119: variable 'ansible_connection' from source: unknown
8313 1726877633.21123: variable 'ansible_module_compression' from source: unknown
8313 1726877633.21126: variable 'ansible_shell_type' from source: unknown
8313 1726877633.21129: variable 'ansible_shell_executable' from source: unknown
8313 1726877633.21133: variable 'ansible_host' from source: host vars for 'managed_node2'
8313 1726877633.21137: variable 'ansible_pipelining' from source: unknown
8313 1726877633.21140: variable 'ansible_timeout' from source: unknown
8313 1726877633.21144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8313 1726877633.21244: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
8313 1726877633.21257: variable 'omit' from source: magic vars
8313 1726877633.21263: starting attempt loop
8313 1726877633.21266: running the handler
8313 1726877633.21274: handler run complete
8313 1726877633.21307: attempt loop complete, returning result
8313 1726877633.21312: _execute() done
8313 1726877633.21315: dumping result to json
8313 1726877633.21317: done dumping result, returning
8313 1726877633.21321: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available [0afffc7c-1039-1dfd-2729-000000000023]
8313 1726877633.21326: sending task result for task 0afffc7c-1039-1dfd-2729-000000000023
8313 1726877633.21344: done sending task result for task 0afffc7c-1039-1dfd-2729-000000000023
8313 1726877633.21346: WORKER PROCESS EXITING
8299 1726877633.21579: marking managed_node2 as failed
8299 1726877633.21586: marking host managed_node2 failed, current state: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8299 1726877633.21592: ^ failed state is now: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=5, fail_state=2, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8299 1726877633.21594: getting the next task for host managed_node2
8299 1726877633.21597: done getting next task for host managed_node2
8299 1726877633.21598: ^ task is: TASK: Assert that user is notified about missing variables
8299 1726877633.21599: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
fatal: [managed_node2]: FAILED! => {
    "changed": false
}

MSG:

Variable ad_integration_realm must be provided!

8299 1726877633.21703: no more pending results, returning what we have
8299 1726877633.21705: results queue empty
8299 1726877633.21706: checking for any_errors_fatal
8299 1726877633.21708: done checking for any_errors_fatal
8299 1726877633.21709: checking for max_fail_percentage
8299 1726877633.21709: done checking for max_fail_percentage
8299 1726877633.21709: checking to see if all hosts have failed and the running result is not ok
8299 1726877633.21710: done checking to see if all hosts have failed
8299 1726877633.21710: getting the remaining hosts for this loop
8299 1726877633.21711: done getting the remaining hosts for this loop
8299 1726877633.21713: getting the next task for host managed_node2
8299 1726877633.21714: done getting next task for host managed_node2
8299 1726877633.21715: ^ task is: TASK: Assert that user is notified about missing variables
8299 1726877633.21716: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 8299 1726877633.21719: getting variables 8299 1726877633.21720: in VariableManager get_vars() 8299 1726877633.21747: Calling all_inventory to load vars for managed_node2 8299 1726877633.21749: Calling groups_inventory to load vars for managed_node2 8299 1726877633.21751: Calling all_plugins_inventory to load vars for managed_node2 8299 1726877633.21762: Calling all_plugins_play to load vars for managed_node2 8299 1726877633.21766: Calling groups_plugins_inventory to load vars for managed_node2 8299 1726877633.21768: Calling groups_plugins_play to load vars for managed_node2 8299 1726877633.21793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726877633.21804: done with get_vars() 8299 1726877633.21809: done getting variables 8299 1726877633.21872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Assert that user is notified about missing variables] ******************** task path: /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:16 Friday 20 September 2024 20:13:53 -0400 (0:00:00.020) 0:00:00.101 ****** 8299 1726877633.21890: entering _queue_task() for managed_node2/assert 8299 1726877633.21891: Creating lock for assert 8299 1726877633.22070: worker is 1 (out of 1 available) 8299 1726877633.22081: exiting _queue_task() for managed_node2/assert 8299 1726877633.22091: done queuing things up, now waiting for results queue to drain 8299 1726877633.22092: waiting for pending results... 
8314 1726877633.22170: running TaskExecutor() for managed_node2/TASK: Assert that user is notified about missing variables 8314 1726877633.22246: in run() - task 0afffc7c-1039-1dfd-2729-000000000007 8314 1726877633.22261: variable 'ansible_search_path' from source: unknown 8314 1726877633.22287: calling self._execute() 8314 1726877633.22330: variable 'ansible_host' from source: host vars for 'managed_node2' 8314 1726877633.22336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8314 1726877633.22341: variable 'omit' from source: magic vars 8314 1726877633.22410: variable 'omit' from source: magic vars 8314 1726877633.22435: variable 'omit' from source: magic vars 8314 1726877633.22497: variable 'omit' from source: magic vars 8314 1726877633.22523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8314 1726877633.22552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8314 1726877633.22574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8314 1726877633.22589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8314 1726877633.22601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8314 1726877633.22621: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8314 1726877633.22626: variable 'ansible_host' from source: host vars for 'managed_node2' 8314 1726877633.22631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8314 1726877633.22703: Set connection var ansible_pipelining to False 8314 1726877633.22711: Set connection var ansible_connection to ssh 8314 1726877633.22717: Set connection var ansible_shell_executable to /bin/sh 8314 1726877633.22722: Set connection var 
ansible_shell_type to sh 8314 1726877633.22729: Set connection var ansible_module_compression to ZIP_DEFLATED 8314 1726877633.22734: Set connection var ansible_timeout to 10 8314 1726877633.22748: variable 'ansible_shell_executable' from source: unknown 8314 1726877633.22752: variable 'ansible_connection' from source: unknown 8314 1726877633.22757: variable 'ansible_module_compression' from source: unknown 8314 1726877633.22760: variable 'ansible_shell_type' from source: unknown 8314 1726877633.22764: variable 'ansible_shell_executable' from source: unknown 8314 1726877633.22767: variable 'ansible_host' from source: host vars for 'managed_node2' 8314 1726877633.22771: variable 'ansible_pipelining' from source: unknown 8314 1726877633.22774: variable 'ansible_timeout' from source: unknown 8314 1726877633.22779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8314 1726877633.22904: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8314 1726877633.22916: variable 'omit' from source: magic vars 8314 1726877633.22922: starting attempt loop 8314 1726877633.22925: running the handler 8314 1726877633.23239: variable 'ansible_failed_result' from source: set_fact 8314 1726877633.23255: Evaluated conditional ("Variable ad_integration_realm" in ansible_failed_result.msg): True 8314 1726877633.23261: handler run complete 8314 1726877633.23275: attempt loop complete, returning result 8314 1726877633.23279: _execute() done 8314 1726877633.23282: dumping result to json 8314 1726877633.23286: done dumping result, returning 8314 1726877633.23291: done running TaskExecutor() for managed_node2/TASK: Assert that user is notified about missing variables [0afffc7c-1039-1dfd-2729-000000000007] 8314 
1726877633.23298: sending task result for task 0afffc7c-1039-1dfd-2729-000000000007 8314 1726877633.23317: done sending task result for task 0afffc7c-1039-1dfd-2729-000000000007 8314 1726877633.23319: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8299 1726877633.23476: no more pending results, returning what we have 8299 1726877633.23478: results queue empty 8299 1726877633.23479: checking for any_errors_fatal 8299 1726877633.23485: done checking for any_errors_fatal 8299 1726877633.23485: checking for max_fail_percentage 8299 1726877633.23486: done checking for max_fail_percentage 8299 1726877633.23487: checking to see if all hosts have failed and the running result is not ok 8299 1726877633.23492: done checking to see if all hosts have failed 8299 1726877633.23492: getting the remaining hosts for this loop 8299 1726877633.23493: done getting the remaining hosts for this loop 8299 1726877633.23496: getting the next task for host managed_node2 8299 1726877633.23501: done getting next task for host managed_node2 8299 1726877633.23502: ^ task is: TASK: meta (flush_handlers) 8299 1726877633.23503: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8299 1726877633.23505: getting variables 8299 1726877633.23506: in VariableManager get_vars() 8299 1726877633.23523: Calling all_inventory to load vars for managed_node2 8299 1726877633.23525: Calling groups_inventory to load vars for managed_node2 8299 1726877633.23527: Calling all_plugins_inventory to load vars for managed_node2 8299 1726877633.23532: Calling all_plugins_play to load vars for managed_node2 8299 1726877633.23534: Calling groups_plugins_inventory to load vars for managed_node2 8299 1726877633.23535: Calling groups_plugins_play to load vars for managed_node2 8299 1726877633.23566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726877633.23581: done with get_vars() 8299 1726877633.23587: done getting variables 8299 1726877633.23633: in VariableManager get_vars() 8299 1726877633.23678: Calling all_inventory to load vars for managed_node2 8299 1726877633.23680: Calling groups_inventory to load vars for managed_node2 8299 1726877633.23683: Calling all_plugins_inventory to load vars for managed_node2 8299 1726877633.23691: Calling all_plugins_play to load vars for managed_node2 8299 1726877633.23693: Calling groups_plugins_inventory to load vars for managed_node2 8299 1726877633.23699: Calling groups_plugins_play to load vars for managed_node2 8299 1726877633.23722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726877633.23731: done with get_vars() 8299 1726877633.23738: done queuing things up, now waiting for results queue to drain 8299 1726877633.23739: results queue empty 8299 1726877633.23739: checking for any_errors_fatal 8299 1726877633.23741: done checking for any_errors_fatal 8299 1726877633.23741: checking for max_fail_percentage 8299 1726877633.23742: done checking for max_fail_percentage 8299 1726877633.23742: checking to see if all hosts have failed and the running result is not ok 8299 1726877633.23743: 
done checking to see if all hosts have failed 8299 1726877633.23743: getting the remaining hosts for this loop 8299 1726877633.23743: done getting the remaining hosts for this loop 8299 1726877633.23745: getting the next task for host managed_node2 8299 1726877633.23747: done getting next task for host managed_node2 8299 1726877633.23751: ^ task is: TASK: meta (flush_handlers) 8299 1726877633.23752: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8299 1726877633.23755: getting variables 8299 1726877633.23756: in VariableManager get_vars() 8299 1726877633.23761: Calling all_inventory to load vars for managed_node2 8299 1726877633.23763: Calling groups_inventory to load vars for managed_node2 8299 1726877633.23766: Calling all_plugins_inventory to load vars for managed_node2 8299 1726877633.23769: Calling all_plugins_play to load vars for managed_node2 8299 1726877633.23770: Calling groups_plugins_inventory to load vars for managed_node2 8299 1726877633.23771: Calling groups_plugins_play to load vars for managed_node2 8299 1726877633.23789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726877633.23802: done with get_vars() 8299 1726877633.23808: done getting variables 8299 1726877633.23838: in VariableManager get_vars() 8299 1726877633.23844: Calling all_inventory to load vars for managed_node2 8299 1726877633.23845: Calling groups_inventory to load vars for managed_node2 8299 1726877633.23846: Calling all_plugins_inventory to load vars for managed_node2 8299 1726877633.23849: Calling all_plugins_play to load vars for managed_node2 8299 1726877633.23850: Calling groups_plugins_inventory to load vars for managed_node2 8299 1726877633.23851: Calling 
groups_plugins_play to load vars for managed_node2 8299 1726877633.23872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726877633.23880: done with get_vars() 8299 1726877633.23887: done queuing things up, now waiting for results queue to drain 8299 1726877633.23888: results queue empty 8299 1726877633.23888: checking for any_errors_fatal 8299 1726877633.23890: done checking for any_errors_fatal 8299 1726877633.23890: checking for max_fail_percentage 8299 1726877633.23890: done checking for max_fail_percentage 8299 1726877633.23891: checking to see if all hosts have failed and the running result is not ok 8299 1726877633.23891: done checking to see if all hosts have failed 8299 1726877633.23891: getting the remaining hosts for this loop 8299 1726877633.23892: done getting the remaining hosts for this loop 8299 1726877633.23893: getting the next task for host managed_node2 8299 1726877633.23895: done getting next task for host managed_node2 8299 1726877633.23895: ^ task is: None 8299 1726877633.23896: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8299 1726877633.23896: done queuing things up, now waiting for results queue to drain 8299 1726877633.23897: results queue empty 8299 1726877633.23897: checking for any_errors_fatal 8299 1726877633.23897: done checking for any_errors_fatal 8299 1726877633.23898: checking for max_fail_percentage 8299 1726877633.23898: done checking for max_fail_percentage 8299 1726877633.23898: checking to see if all hosts have failed and the running result is not ok 8299 1726877633.23899: done checking to see if all hosts have failed 8299 1726877633.23899: getting the next task for host managed_node2 8299 1726877633.23901: done getting next task for host managed_node2 8299 1726877633.23901: ^ task is: None 8299 1726877633.23902: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False PLAY RECAP ********************************************************************* managed_node2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=1 ignored=0 Friday 20 September 2024 20:13:53 -0400 (0:00:00.020) 0:00:00.122 ****** =============================================================================== Include the role -------------------------------------------------------- 0.07s /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:11 Assert that user is notified about missing variables -------------------- 0.02s /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:16 fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available --- 0.02s /tmp/collections-Z4Q/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 8299 1726877633.23974: RUNNING CLEANUP
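The run above exercises a standard system-roles test pattern: a role-side `fail` guard rejects a missing mandatory variable, the test includes the role inside a `block`, catches the failure in `rescue`, and then asserts on the captured error message (the `rescued=1` in the recap and the `ansible_failed_result` lookup in the trace reflect this). A minimal sketch of that pattern, reconstructed from the log output; the task names, the message text, and the conditional are taken from the trace, while the surrounding play structure is an assumption and the real `tests_default.yml` and `tasks/main.yml` may differ:

```yaml
# Role-side guard (shape inferred from the trace of
# roles/ad_integration/tasks/main.yml) -- fail fast when the variable is unset:
- name: Ensure that mandatory variable ad_integration_realm is available
  fail:
    msg: Variable ad_integration_realm must be provided!
  when: not ad_integration_realm

# Test-side pattern (shape inferred from the trace of tests_default.yml):
- name: Verify error handling for a missing mandatory variable
  hosts: all
  tasks:
    - name: Include the role
      block:
        - name: Include the role without ad_integration_realm
          include_role:
            name: fedora.linux_system_roles.ad_integration
      rescue:
        - name: Preserve the failure for later assertions
          set_fact:
            ansible_failed_result: "{{ ansible_failed_result }}"

    - name: Assert that user is notified about missing variables
      assert:
        that:
          - '"Variable ad_integration_realm" in ansible_failed_result.msg'
```

Because the failure is rescued, the play ends with the host unfailed, which is why the recap reports `failed=0  rescued=1` while the assert task reports `ok`.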