[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 3083 1726877634.55907: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Hbq executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 3083 1726877634.56271: Added group all to inventory 3083 1726877634.56274: Added group ungrouped to inventory 3083 1726877634.56277: Group all now contains ungrouped 3083 1726877634.56280: Examining possible inventory source: /tmp/ad_integration-4Yb/inventory.yml 3083 1726877634.68267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 3083 1726877634.68321: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 3083 1726877634.68342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 3083 1726877634.68391: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 3083 1726877634.68453: Loaded config def from plugin (inventory/script) 3083 1726877634.68455: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 3083 1726877634.68489: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 3083 1726877634.68561: Loaded config def from plugin (inventory/yaml) 3083 1726877634.68563: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 3083 1726877634.68634: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 3083 1726877634.68987: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 3083 1726877634.68990: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 3083 1726877634.68994: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 3083 1726877634.68999: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 3083 1726877634.69003: Loading data from /tmp/ad_integration-4Yb/inventory.yml 3083 1726877634.69060: /tmp/ad_integration-4Yb/inventory.yml was not parsable by auto 3083 1726877634.69113: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 3083 1726877634.69149: Loading data from /tmp/ad_integration-4Yb/inventory.yml 3083 1726877634.69215: group all already in inventory 3083 1726877634.69221: set inventory_file for managed_node1 3083 1726877634.69224: set inventory_dir for managed_node1 3083 1726877634.69225: Added host managed_node1 to inventory 3083 1726877634.69227: Added host managed_node1 to group all 3083 1726877634.69227: set ansible_host for managed_node1 3083 1726877634.69228: 
set ansible_ssh_extra_args for managed_node1 3083 1726877634.69230: set inventory_file for managed_node2 3083 1726877634.69232: set inventory_dir for managed_node2 3083 1726877634.69233: Added host managed_node2 to inventory 3083 1726877634.69234: Added host managed_node2 to group all 3083 1726877634.69235: set ansible_host for managed_node2 3083 1726877634.69236: set ansible_ssh_extra_args for managed_node2 3083 1726877634.69239: set inventory_file for managed_node3 3083 1726877634.69242: set inventory_dir for managed_node3 3083 1726877634.69243: Added host managed_node3 to inventory 3083 1726877634.69245: Added host managed_node3 to group all 3083 1726877634.69245: set ansible_host for managed_node3 3083 1726877634.69246: set ansible_ssh_extra_args for managed_node3 3083 1726877634.69248: Reconcile groups and hosts in inventory. 3083 1726877634.69251: Group ungrouped now contains managed_node1 3083 1726877634.69253: Group ungrouped now contains managed_node2 3083 1726877634.69254: Group ungrouped now contains managed_node3 3083 1726877634.69315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 3083 1726877634.69419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 3083 1726877634.69461: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 3083 1726877634.69485: Loaded config def from plugin (vars/host_group_vars) 3083 1726877634.69486: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 3083 1726877634.69492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 3083 1726877634.69500: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 3083 1726877634.69534: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 3083 1726877634.69801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.69876: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 3083 1726877634.69913: Loaded config def from plugin (connection/local) 3083 1726877634.69916: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 3083 1726877634.70435: Loaded config def from plugin (connection/paramiko_ssh) 3083 1726877634.70441: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 3083 1726877634.71160: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 3083 1726877634.71192: Loaded config def from plugin (connection/psrp) 3083 1726877634.71197: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 3083 1726877634.71788: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 3083 1726877634.71821: Loaded config def from plugin (connection/ssh) 3083 1726877634.71824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 3083 1726877634.73382: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 3083 1726877634.73418: Loaded config def from plugin (connection/winrm) 3083 1726877634.73421: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 3083 1726877634.73447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 3083 1726877634.73502: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 3083 1726877634.73558: Loaded config def from plugin (shell/cmd) 3083 1726877634.73560: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) [WARNING]: Could not match supplied host pattern, ignoring: ad 3083 1726877634.73580: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 3083 1726877634.73635: Loaded config def from plugin (shell/powershell) 3083 1726877634.73639: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 3083 1726877634.73680: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 3083 1726877634.73827: Loaded config def from plugin (shell/sh) 3083 1726877634.73829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 3083 1726877634.73857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 3083 1726877634.73962: Loaded config def from plugin (become/runas) 3083 1726877634.73964: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 3083 1726877634.74116: Loaded config def from plugin (become/su) 3083 1726877634.74118: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 3083 1726877634.74253: Loaded config def from plugin (become/sudo) 3083 1726877634.74254: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 3083 1726877634.74282: Loading data from /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml 3083 1726877634.74634: trying /usr/local/lib/python3.12/site-packages/ansible/modules 3083 1726877634.76913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 3083 1726877634.77010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 3083 
1726877634.77021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 3083 1726877634.77215: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 3083 1726877634.77344: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 3083 1726877634.77346: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Hbq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 3083 1726877634.77371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 3083 1726877634.77390: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 3083 1726877634.77533: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 3083 1726877634.77584: Loaded config def from plugin (callback/default) 3083 1726877634.77586: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 3083 1726877634.78501: Loaded config def from plugin (callback/junit) 3083 1726877634.78503: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 3083 1726877634.78540: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 3083 1726877634.78592: Loaded config def from plugin (callback/minimal) 3083 1726877634.78595: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 3083 1726877634.78628: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 3083 1726877634.78678: Loaded config def from plugin (callback/tree) 3083 1726877634.78680: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, 
class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 3083 1726877634.78781: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 3083 1726877634.78783: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Hbq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_default.yml **************************************************** 1 plays in /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml 3083 1726877634.78805: in VariableManager get_vars() 3083 1726877634.78829: Could not match supplied host pattern, ignoring: ad 3083 1726877634.78845: done with get_vars() 3083 1726877634.78851: in VariableManager get_vars() 3083 1726877634.78859: done with get_vars() 3083 1726877634.78863: variable 'omit' from source: magic vars 3083 1726877634.78892: in VariableManager get_vars() 3083 1726877634.78904: done with get_vars() 3083 1726877634.78920: variable 'omit' from source: magic vars PLAY [Ensure role behaviour with default parameters] *************************** 3083 1726877634.79335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 3083 1726877634.79396: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 3083 1726877634.79422: getting the remaining hosts for this loop 3083 1726877634.79424: done getting the remaining hosts for this loop 3083 1726877634.79426: getting the next task for host managed_node2 3083 1726877634.79430: done getting next task for host managed_node2 3083 1726877634.79431: ^ task is: TASK: meta (flush_handlers) 3083 1726877634.79432: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 3083 1726877634.79440: getting variables 3083 1726877634.79440: in VariableManager get_vars() 3083 1726877634.79448: Calling all_inventory to load vars for managed_node2 3083 1726877634.79450: Calling groups_inventory to load vars for managed_node2 3083 1726877634.79452: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.79462: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.79470: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.79474: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.79505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.79546: done with get_vars() 3083 1726877634.79551: done getting variables 3083 1726877634.79583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 3083 1726877634.79629: in VariableManager get_vars() 3083 1726877634.79636: Calling all_inventory to load vars for managed_node2 3083 1726877634.79640: Calling groups_inventory to load vars for managed_node2 3083 1726877634.79642: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.79645: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.79647: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.79649: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.79671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.79681: done with get_vars() 3083 1726877634.79689: done queuing things up, now waiting for results queue to drain 3083 1726877634.79690: results queue empty 3083 1726877634.79691: checking for any_errors_fatal 3083 1726877634.79692: done checking for any_errors_fatal 3083 1726877634.79692: checking for max_fail_percentage 3083 1726877634.79697: done checking for max_fail_percentage 3083 1726877634.79698: checking to see if all hosts have failed and the running result is not ok 3083 1726877634.79698: done checking to see if all hosts have failed 3083 1726877634.79699: getting the remaining hosts for this loop 3083 1726877634.79700: done getting the remaining hosts for this loop 3083 1726877634.79702: getting the next task for host managed_node2 3083 1726877634.79705: done getting next task for host managed_node2 3083 1726877634.79706: ^ task is: TASK: Include the role 3083 1726877634.79708: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 3083 1726877634.79709: getting variables 3083 1726877634.79710: in VariableManager get_vars() 3083 1726877634.79727: Calling all_inventory to load vars for managed_node2 3083 1726877634.79729: Calling groups_inventory to load vars for managed_node2 3083 1726877634.79731: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.79734: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.79738: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.79741: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.79762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.79773: done with get_vars() 3083 1726877634.79777: done getting variables TASK [Include the role] ******************************************************** task path: /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:11 Friday 20 September 2024 20:13:54 -0400 (0:00:00.011) 0:00:00.011 ****** 3083 1726877634.79834: entering _queue_task() for managed_node2/include_role 3083 1726877634.79835: Creating lock for include_role 3083 1726877634.80083: worker is 1 (out of 1 available) 3083 1726877634.80096: exiting _queue_task() for managed_node2/include_role 3083 1726877634.80108: done queuing things up, now waiting for results queue to drain 3083 1726877634.80110: waiting for pending results... 3083 1726877634.80231: running TaskExecutor() for managed_node2/TASK: Include the role 3083 1726877634.80297: in run() - task 1225e0f7-9bb9-a3d5-0853-000000000006 3083 1726877634.80308: variable 'ansible_search_path' from source: unknown 3083 1726877634.80342: calling self._execute() 3083 1726877634.80392: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.80400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.80409: variable 'omit' from source: magic vars 3083 1726877634.80490: _execute() done 3083 1726877634.80495: dumping result to json 3083 1726877634.80499: done dumping result, returning 3083 1726877634.80504: done running TaskExecutor() for managed_node2/TASK: Include the role [1225e0f7-9bb9-a3d5-0853-000000000006] 3083 1726877634.80510: sending task result for task 1225e0f7-9bb9-a3d5-0853-000000000006 3083 1726877634.80620: done sending task result for task 1225e0f7-9bb9-a3d5-0853-000000000006 3083 1726877634.80624: WORKER PROCESS EXITING 3083 1726877634.80671: no more pending results, returning what we have 3083 1726877634.80676: in VariableManager get_vars() 3083 1726877634.80703: Calling all_inventory to load vars for managed_node2 3083 1726877634.80706: Calling groups_inventory to load vars for managed_node2 3083 1726877634.80710: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.80719: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.80721: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.80724: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.80759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.80770: done with get_vars() 3083 1726877634.80774: variable 'ansible_search_path' from source: unknown 3083 1726877634.80831: variable 'omit' from source: magic vars 3083 1726877634.80853: variable 'omit' from source: magic vars 3083 1726877634.80864: variable 'omit' from 
source: magic vars 3083 1726877634.80867: we have included files to process 3083 1726877634.80868: generating all_blocks data 3083 1726877634.80869: done generating all_blocks data 3083 1726877634.80869: processing included file: fedora.linux_system_roles.ad_integration 3083 1726877634.80886: in VariableManager get_vars() 3083 1726877634.80896: done with get_vars() 3083 1726877634.80947: in VariableManager get_vars() 3083 1726877634.80959: done with get_vars() 3083 1726877634.80991: Loading data from /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/vars/main.yml 3083 1726877634.81198: Loading data from /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/defaults/main.yml 3083 1726877634.81300: Loading data from /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/meta/main.yml 3083 1726877634.81411: Loading data from /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml 3083 1726877634.84021: in VariableManager get_vars() 3083 1726877634.84039: done with get_vars() 3083 1726877634.86254: Loading data from /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml 3083 1726877634.86811: iterating over new_blocks loaded from include file 3083 1726877634.86813: in VariableManager get_vars() 3083 1726877634.86823: done with get_vars() 3083 1726877634.86824: filtering new block on tags 3083 1726877634.86872: done filtering new block on tags 3083 1726877634.86875: in VariableManager get_vars() 3083 1726877634.86883: done with get_vars() 3083 1726877634.86884: filtering new block on tags 3083 1726877634.86898: done filtering new block on tags 3083 1726877634.86899: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.ad_integration for managed_node2 3083 1726877634.86903: extending task lists for all hosts with included blocks 3083 1726877634.86941: done extending task lists 3083 1726877634.86942: done processing included files 3083 1726877634.86942: results queue empty 3083 1726877634.86943: checking for any_errors_fatal 3083 1726877634.86944: done checking for any_errors_fatal 3083 1726877634.86945: checking for max_fail_percentage 3083 1726877634.86946: done checking for max_fail_percentage 3083 1726877634.86947: checking to see if all hosts have failed and the running result is not ok 3083 1726877634.86947: done checking to see if all hosts have failed 3083 1726877634.86948: getting the remaining hosts for this loop 3083 1726877634.86949: done getting the remaining hosts for this loop 3083 1726877634.86950: getting the next task for host managed_node2 3083 1726877634.86953: done getting next task for host managed_node2 3083 1726877634.86955: ^ task is: TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available 3083 1726877634.86957: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
3083 1726877634.86965: getting variables
3083 1726877634.86966: in VariableManager get_vars()
3083 1726877634.86975: Calling all_inventory to load vars for managed_node2
3083 1726877634.86977: Calling groups_inventory to load vars for managed_node2
3083 1726877634.86988: Calling all_plugins_inventory to load vars for managed_node2
3083 1726877634.86992: Calling all_plugins_play to load vars for managed_node2
3083 1726877634.86996: Calling groups_plugins_inventory to load vars for managed_node2
3083 1726877634.86999: Calling groups_plugins_play to load vars for managed_node2
3083 1726877634.87020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
3083 1726877634.87039: done with get_vars()
3083 1726877634.87045: done getting variables
3083 1726877634.87100: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] ***
task path: /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
Friday 20 September 2024 20:13:54 -0400 (0:00:00.072) 0:00:00.084 ******

3083 1726877634.87121: entering _queue_task() for managed_node2/fail
3083 1726877634.87122: Creating lock for fail
3083 1726877634.87334: worker is 1 (out of 1 available)
3083 1726877634.87349: exiting _queue_task() for managed_node2/fail
3083 1726877634.87360: done queuing things up, now waiting for results queue to drain
3083 1726877634.87362: waiting for pending results...
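The task just queued is the role's guard for its one mandatory variable. Reconstructed from the task banner above and the trace that follows (the wording of the actual task in roles/ad_integration/tasks/main.yml may differ), it is roughly:

    - name: Ensure that mandatory variable ad_integration_realm is available
      fail:
        msg: Variable ad_integration_realm must be provided!
      when: not ad_integration_realm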
3083 1726877634.87509: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available 3083 1726877634.87574: in run() - task 1225e0f7-9bb9-a3d5-0853-000000000023 3083 1726877634.87587: variable 'ansible_search_path' from source: unknown 3083 1726877634.87591: variable 'ansible_search_path' from source: unknown 3083 1726877634.87624: calling self._execute() 3083 1726877634.87676: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.87683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.87691: variable 'omit' from source: magic vars 3083 1726877634.88050: variable 'ad_integration_realm' from source: role '' defaults 3083 1726877634.88057: Evaluated conditional (not ad_integration_realm): True 3083 1726877634.88063: variable 'omit' from source: magic vars 3083 1726877634.88099: variable 'omit' from source: magic vars 3083 1726877634.88127: variable 'omit' from source: magic vars 3083 1726877634.88163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 3083 1726877634.88194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 3083 1726877634.88212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 3083 1726877634.88228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 3083 1726877634.88240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 3083 1726877634.88267: variable 'inventory_hostname' from source: host vars for 'managed_node2' 3083 1726877634.88271: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.88277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.88351: Set connection var ansible_connection to ssh 3083 1726877634.88355: Set connection var ansible_shell_type to sh 3083 1726877634.88366: Set connection var ansible_module_compression to ZIP_DEFLATED 3083 1726877634.88372: Set connection var ansible_timeout to 10 3083 1726877634.88380: Set connection var ansible_shell_executable to /bin/sh 3083 1726877634.88390: Set connection var ansible_pipelining to False 3083 1726877634.88410: variable 'ansible_shell_executable' from source: unknown 3083 1726877634.88413: variable 'ansible_connection' from source: unknown 3083 1726877634.88416: variable 'ansible_module_compression' from source: unknown 3083 1726877634.88420: variable 'ansible_shell_type' from source: unknown 3083 1726877634.88423: variable 'ansible_shell_executable' from source: unknown 3083 1726877634.88427: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.88432: variable 'ansible_pipelining' from source: unknown 3083 1726877634.88435: variable 'ansible_timeout' from source: unknown 3083 1726877634.88477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.88568: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 3083 1726877634.88579: variable 'omit' from source: 
magic vars
3083 1726877634.88582: starting attempt loop
3083 1726877634.88591: running the handler
3083 1726877634.88604: handler run complete
3083 1726877634.88629: attempt loop complete, returning result
3083 1726877634.88632: _execute() done
3083 1726877634.88635: dumping result to json
3083 1726877634.88642: done dumping result, returning
3083 1726877634.88648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available [1225e0f7-9bb9-a3d5-0853-000000000023]
3083 1726877634.88653: sending task result for task 1225e0f7-9bb9-a3d5-0853-000000000023
3083 1726877634.88742: done sending task result for task 1225e0f7-9bb9-a3d5-0853-000000000023
3083 1726877634.88746: WORKER PROCESS EXITING
3083 1726877634.88754: marking managed_node2 as failed
3083 1726877634.88762: marking host managed_node2 failed, current state: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
3083 1726877634.88769: ^ failed state is now: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=5, fail_state=2, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
3083 1726877634.88772: getting the next task for host managed_node2
3083 1726877634.88777: done getting next task for host managed_node2
3083 1726877634.88780: ^ task is: TASK: Assert that user is notified about missing variables
3083 1726877634.88781: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

fatal: [managed_node2]: FAILED! => {
    "changed": false
}

MSG:

Variable ad_integration_realm must be provided!
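In a real deployment this failure is avoided by defining ad_integration_realm before the role is included. A minimal sketch, with a hypothetical play and realm value and with any other variables the role may require omitted:

    - name: Join the hosts to Active Directory   # hypothetical play name
      hosts: managed_node2
      vars:
        ad_integration_realm: ad.example.com     # hypothetical realm value
      roles:
        - fedora.linux_system_roles.ad_integration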
3083 1726877634.88930: no more pending results, returning what we have 3083 1726877634.88933: results queue empty 3083 1726877634.88933: checking for any_errors_fatal 3083 1726877634.88934: done checking for any_errors_fatal 3083 1726877634.88935: checking for max_fail_percentage 3083 1726877634.88936: done checking for max_fail_percentage 3083 1726877634.88938: checking to see if all hosts have failed and the running result is not ok 3083 1726877634.88939: done checking to see if all hosts have failed 3083 1726877634.88940: getting the remaining hosts for this loop 3083 1726877634.88941: done getting the remaining hosts for this loop 3083 1726877634.88943: getting the next task for host managed_node2 3083 1726877634.88946: done getting next task for host managed_node2 3083 1726877634.88947: ^ task is: TASK: Assert that user is notified about missing variables 3083 1726877634.88949: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 3083 1726877634.88953: getting variables 3083 1726877634.88953: in VariableManager get_vars() 3083 1726877634.88971: Calling all_inventory to load vars for managed_node2 3083 1726877634.88972: Calling groups_inventory to load vars for managed_node2 3083 1726877634.88975: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.88985: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.88988: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.88991: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.89023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.89036: done with get_vars() 3083 1726877634.89043: done getting variables 3083 1726877634.89118: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Assert that user is notified about missing variables] ******************** task path: /tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:16 Friday 20 September 2024 20:13:54 -0400 (0:00:00.020) 0:00:00.104 ****** 3083 1726877634.89140: entering _queue_task() for managed_node2/assert 3083 1726877634.89141: Creating lock for assert 3083 1726877634.89321: worker is 1 (out of 1 available) 3083 1726877634.89335: exiting _queue_task() for managed_node2/assert 3083 1726877634.89347: done queuing things up, now waiting for results queue to drain 3083 1726877634.89349: waiting for pending results... 
3083 1726877634.89482: running TaskExecutor() for managed_node2/TASK: Assert that user is notified about missing variables 3083 1726877634.89533: in run() - task 1225e0f7-9bb9-a3d5-0853-000000000007 3083 1726877634.89545: variable 'ansible_search_path' from source: unknown 3083 1726877634.89578: calling self._execute() 3083 1726877634.89624: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.89631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.89642: variable 'omit' from source: magic vars 3083 1726877634.89753: variable 'omit' from source: magic vars 3083 1726877634.89778: variable 'omit' from source: magic vars 3083 1726877634.89811: variable 'omit' from source: magic vars 3083 1726877634.89903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 3083 1726877634.89907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 3083 1726877634.89910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 3083 1726877634.89912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 3083 1726877634.89915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 3083 1726877634.89942: variable 'inventory_hostname' from source: host vars for 'managed_node2' 3083 1726877634.89945: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.89947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.90022: Set connection var ansible_connection to ssh 3083 1726877634.90028: Set connection var ansible_shell_type to sh 3083 1726877634.90097: Set connection var ansible_module_compression to ZIP_DEFLATED 3083 1726877634.90099: Set connection var ansible_timeout to 10 3083 1726877634.90101: Set connection var ansible_shell_executable to /bin/sh 3083 1726877634.90102: Set connection var ansible_pipelining to False 3083 1726877634.90104: variable 'ansible_shell_executable' from source: unknown 3083 1726877634.90105: variable 'ansible_connection' from source: unknown 3083 1726877634.90106: variable 'ansible_module_compression' from source: unknown 3083 1726877634.90107: variable 'ansible_shell_type' from source: unknown 3083 1726877634.90109: variable 'ansible_shell_executable' from source: unknown 3083 1726877634.90110: variable 'ansible_host' from source: host vars for 'managed_node2' 3083 1726877634.90111: variable 'ansible_pipelining' from source: unknown 3083 1726877634.90112: variable 'ansible_timeout' from source: unknown 3083 1726877634.90114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 3083 1726877634.90226: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 3083 1726877634.90235: variable 'omit' from source: magic vars 3083 1726877634.90241: starting attempt loop 3083 1726877634.90244: running the handler 3083 1726877634.90528: variable 'ansible_failed_result' from source: set_fact 3083 1726877634.90564: Evaluated conditional ("Variable ad_integration_realm" in ansible_failed_result.msg): True 
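The conditional evaluated just above comes from the test's rescue path: tests_default.yml includes the role with no variables set, expects it to fail, and asserts on the captured error message. A rough reconstruction under those assumptions (the real test file, its host pattern, and how it captures ansible_failed_result may differ):

    - name: Ensure role behaviour with default parameters
      hosts: all          # the real test's host pattern may differ
      tasks:
        - name: Run the role without variables and expect a clear error
          block:
            - name: Include the role
              include_role:
                name: fedora.linux_system_roles.ad_integration
          rescue:
            - name: Assert that user is notified about missing variables
              assert:
                that:
                  - '"Variable ad_integration_realm" in ansible_failed_result.msg'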
3083 1726877634.90567: handler run complete
3083 1726877634.90570: attempt loop complete, returning result
3083 1726877634.90574: _execute() done
3083 1726877634.90577: dumping result to json
3083 1726877634.90579: done dumping result, returning
3083 1726877634.90581: done running TaskExecutor() for managed_node2/TASK: Assert that user is notified about missing variables [1225e0f7-9bb9-a3d5-0853-000000000007]
3083 1726877634.90584: sending task result for task 1225e0f7-9bb9-a3d5-0853-000000000007
3083 1726877634.90658: done sending task result for task 1225e0f7-9bb9-a3d5-0853-000000000007
3083 1726877634.90661: WORKER PROCESS EXITING

ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed

3083 1726877634.90735: no more pending results, returning what we have
3083 1726877634.90741: results queue empty
3083 1726877634.90742: checking for any_errors_fatal
3083 1726877634.90746: done checking for any_errors_fatal
3083 1726877634.90747: checking for max_fail_percentage
3083 1726877634.90749: done checking for max_fail_percentage
3083 1726877634.90750: checking to see if all hosts have failed and the running result is not ok
3083 1726877634.90751: done checking to see if all hosts have failed
3083 1726877634.90752: getting the remaining hosts for this loop
3083 1726877634.90753: done getting the remaining hosts for this loop
3083 1726877634.90757: getting the next task for host managed_node2
3083 1726877634.90763: done getting next task for host managed_node2
3083 1726877634.90765: ^ task is: TASK: meta (flush_handlers)
3083 1726877634.90767: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 3083 1726877634.90769: getting variables 3083 1726877634.90770: in VariableManager get_vars() 3083 1726877634.90792: Calling all_inventory to load vars for managed_node2 3083 1726877634.90796: Calling groups_inventory to load vars for managed_node2 3083 1726877634.90798: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.90805: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.90807: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.90809: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.90842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.90855: done with get_vars() 3083 1726877634.90861: done getting variables 3083 1726877634.90916: in VariableManager get_vars() 3083 1726877634.90944: Calling all_inventory to load vars for managed_node2 3083 1726877634.90946: Calling groups_inventory to load vars for managed_node2 3083 1726877634.90948: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.90951: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.90953: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.90955: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.90976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.90988: done with get_vars() 3083 1726877634.90999: done queuing things up, now waiting for results queue to drain 3083 1726877634.91000: results queue empty 3083 1726877634.91001: checking for any_errors_fatal 3083 1726877634.91002: done checking for any_errors_fatal 3083 1726877634.91003: checking for max_fail_percentage 3083 1726877634.91004: done checking for max_fail_percentage 3083 1726877634.91004: checking to see if all hosts have failed and the running result is not ok 3083 1726877634.91005: done checking to see if all hosts have failed 3083 1726877634.91005: getting the remaining hosts for this loop 3083 1726877634.91006: done getting the remaining hosts for this loop 3083 1726877634.91008: getting the next task for host managed_node2 3083 1726877634.91011: done getting next task for host managed_node2 3083 1726877634.91016: ^ task is: TASK: meta (flush_handlers) 3083 1726877634.91017: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 3083 1726877634.91019: getting variables 3083 1726877634.91020: in VariableManager get_vars() 3083 1726877634.91026: Calling all_inventory to load vars for managed_node2 3083 1726877634.91027: Calling groups_inventory to load vars for managed_node2 3083 1726877634.91029: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.91032: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.91034: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.91036: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.91059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.91069: done with get_vars() 3083 1726877634.91074: done getting variables 3083 1726877634.91113: in VariableManager get_vars() 3083 1726877634.91120: Calling all_inventory to load vars for managed_node2 3083 1726877634.91121: Calling groups_inventory to load vars for managed_node2 3083 1726877634.91123: Calling all_plugins_inventory to load vars for managed_node2 3083 1726877634.91126: Calling all_plugins_play to load vars for managed_node2 3083 1726877634.91128: Calling groups_plugins_inventory to load vars for managed_node2 3083 1726877634.91130: Calling groups_plugins_play to load vars for managed_node2 3083 1726877634.91152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 3083 1726877634.91162: done with get_vars() 3083 1726877634.91169: done queuing things up, now waiting for results queue to drain 3083 1726877634.91170: results queue empty 3083 1726877634.91171: checking for any_errors_fatal 3083 1726877634.91172: done checking for any_errors_fatal 3083 1726877634.91172: checking for max_fail_percentage 3083 1726877634.91173: done checking for max_fail_percentage 3083 1726877634.91173: checking to see if all hosts have failed and the running result is not ok 3083 1726877634.91174: done checking to see if all hosts have failed 3083 1726877634.91174: getting the remaining hosts for this loop 3083 1726877634.91175: done getting the remaining hosts for this loop 3083 1726877634.91177: getting the next task for host managed_node2 3083 1726877634.91179: done getting next task for host managed_node2 3083 1726877634.91180: ^ task is: None 3083 1726877634.91181: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
3083 1726877634.91182: done queuing things up, now waiting for results queue to drain
3083 1726877634.91182: results queue empty
3083 1726877634.91183: checking for any_errors_fatal
3083 1726877634.91183: done checking for any_errors_fatal
3083 1726877634.91184: checking for max_fail_percentage
3083 1726877634.91185: done checking for max_fail_percentage
3083 1726877634.91185: checking to see if all hosts have failed and the running result is not ok
3083 1726877634.91186: done checking to see if all hosts have failed
3083 1726877634.91187: getting the next task for host managed_node2
3083 1726877634.91189: done getting next task for host managed_node2
3083 1726877634.91189: ^ task is: None
3083 1726877634.91190: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=1 ignored=0

Friday 20 September 2024 20:13:54 -0400 (0:00:00.021) 0:00:00.125 ******
===============================================================================
Include the role -------------------------------------------------------- 0.07s
/tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:11
Assert that user is notified about missing variables -------------------- 0.02s
/tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:16
fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available --- 0.02s
/tmp/collections-Hbq/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
3083 1726877634.91248: RUNNING CLEANUP