[WARNING]: Could not match supplied host pattern, ignoring: ad
ansible-playbook [core 2.17.13]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-HVo
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Jun 12 2025, 00:00:00) [GCC 14.2.1 20250110 (Red Hat 14.2.1-8)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_dyndns.yml *****************************************************
1 plays in /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml

PLAY [Ensure that the role configures dynamic dns] *****************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
Saturday 09 August 2025 08:22:05 -0400 (0:00:00.018) 0:00:00.018 *******
[WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
ok: [managed-node2] TASK [Setup fake realm] ******************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33 Saturday 09 August 2025 08:22:07 -0400 (0:00:02.349) 0:00:02.367 ******* included: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2 TASK [Get role variables] ****************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6 Saturday 09 August 2025 08:22:07 -0400 (0:00:00.047) 0:00:02.414 ******* included: fedora.linux_system_roles.ad_integration for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Saturday 09 August 2025 08:22:07 -0400 (0:00:00.032) 0:00:02.447 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Saturday 09 August 2025 08:22:07 -0400 (0:00:00.035) 0:00:02.482 ******* ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Saturday 09 August 2025 08:22:08 -0400 (0:00:00.442) 0:00:02.925 ******* ok: [managed-node2] => { "ansible_facts": { "__ad_integration_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Saturday 09 August 2025 08:22:08 -0400 (0:00:00.024) 0:00:02.949 ******* skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_10.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_10.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_10.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_10.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Saturday 09 August 2025 08:22:08 -0400 (0:00:00.037) 0:00:02.987 ******* changed: [managed-node2] => { "changed": 
true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_m78aggly_ad_int_realm.py", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Saturday 09 August 2025 08:22:08 -0400 (0:00:00.418) 0:00:03.406 ******* ok: [managed-node2] => { "ansible_facts": { "__ad_integration_realm_cmd": "/tmp/lsr_m78aggly_ad_int_realm.py" }, "changed": false } TASK [Create fake realm cmd] *************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Saturday 09 August 2025 08:22:08 -0400 (0:00:00.018) 0:00:03.424 ******* changed: [managed-node2] => { "changed": true, "checksum": "e790736e234dc497bc030f93a903e707179c5412", "dest": "/tmp/lsr_m78aggly_ad_int_realm.py", "gid": 0, "group": "root", "md5sum": "c07a10b77bc45ed4da9bb7b22d921267", "mode": "0755", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 1858, "src": "/root/.ansible/tmp/ansible-tmp-1754742128.9089127-8728-50137190355634/.source.py", "state": "file", "uid": 0 } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Saturday 09 August 2025 08:22:09 -0400 (0:00:00.782) 0:00:04.207 ******* ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753978416.487, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1753978155.597, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 997, "gr_name": "sssd", "inode": 14285469, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0750", "mtime": 1753978155.331, "nlink": 4, "path": "/etc/sssd", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 31, "uid": 0, "version": "3231648934", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": false, "xusr": true } } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.353) 0:00:04.560 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __sssd_dir_stat.stat.exists", "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] ******************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.015) 0:00:04.576 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Saturday 09 August 2025 08:22:10 -0400 
(0:00:00.016) 0:00:04.592 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" } TASK [Test - Run the system role with bogus vars] ****************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.014) 0:00:04.607 ******* included: fedora.linux_system_roles.ad_integration for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.075) 0:00:04.683 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not ad_integration_realm", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.015) 0:00:04.699 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_timesync_source is not none", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.019) 0:00:04.718 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.032) 0:00:04.750 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.032) 0:00:04.783 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.031) 0:00:04.814 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.031) 0:00:04.846 ******* included: 
/tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.022) 0:00:04.868 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.033) 0:00:04.902 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.018) 0:00:04.921 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.018) 0:00:04.940 ******* skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_10.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_10.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_10.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_10.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Saturday 09 August 2025 08:22:10 -0400 (0:00:00.034) 0:00:04.974 ******* changed: [managed-node2] => { "changed": true, "rc": 0, "results": [ "Installed: duktape-2.7.0-10.el10.x86_64", "Installed: PackageKit-1.2.8-8.el10.x86_64", "Installed: PackageKit-glib-1.2.8-8.el10.x86_64", "Installed: appstream-1.0.2-5.el10.x86_64", "Installed: appstream-data-1:10-20250213.1.el10.noarch", "Installed: realmd-0.17.1-12.el10.x86_64", "Installed: polkit-125-3.el10.x86_64", "Installed: libxmlb-0.3.15-7.el10.x86_64", "Installed: shared-mime-info-2.3-8.el10.x86_64", "Installed: polkit-libs-125-3.el10.x86_64", "Installed: 
polkit-pkla-compat-0.1-30.el10.x86_64" ] } lsrpackages: PackageKit realmd TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Saturday 09 August 2025 08:22:20 -0400 (0:00:10.089) 0:00:15.064 ******* changed: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dbus.socket sysinit.target basic.target system.slice", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BindLogSockets": "no", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanLiveMount": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DebugInvocation": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "\"man:realm(8)\" \"man:realmd.conf(5)\"", "DynamicUser": "no", "EffectiveMemoryHigh": "3631226880", "EffectiveMemoryMax": "3631226880", "EffectiveTasksMax": "21831", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", 
"FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13644", "LimitNPROCSoft": "13644", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13644", "LimitSIGPENDINGSoft": "13644", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LiveMountResult": "success", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureDurationUSec": "[not set]", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3158847488", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivatePIDs": "no", "PrivateTmp": "no", "PrivateTmpEx": "no", "PrivateUsers": "no", "PrivateUsersEx": "no", "ProcSubset": "all", 
"ProtectClock": "no", "ProtectControlGroups": "no", "ProtectControlGroupsEx": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket sysinit.target system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21831", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Saturday 09 August 2025 08:22:21 -0400 (0:00:00.782) 0:00:15.847 ******* Notification for handler Handler for ad_integration to restart services has been saved. 
changed: [managed-node2] => { "changed": true, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "md5sum": "59e15d6f22a95d67b152af5a634072a8", "mode": "0400", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "src": "/root/.ansible/tmp/ansible-tmp-1754742141.3370368-8891-36335417146139/.source.conf", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.ad_integration : Flush handlers] *************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:75 Saturday 09 August 2025 08:22:21 -0400 (0:00:00.700) 0:00:16.547 ******* NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services for managed-node2 META: triggered running handlers for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3 Saturday 09 August 2025 08:22:21 -0400 (0:00:00.002) 0:00:16.550 ******* skipping: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "realmd", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.042) 0:00:16.593 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.041) 0:00:16.634 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.034) 0:00:16.669 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_crypto_policies | bool", "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.033) 0:00:16.702 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.030) 0:00:16.733 ******* skipping: [managed-node2] => { "changed": false, "false_condition": 
"ad_integration_sssd_merge_duplicate_sections | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.031) 0:00:16.765 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.030) 0:00:16.795 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.031) 0:00:16.827 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.031) 0:00:16.859 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.033) 0:00:16.892 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.032) 0:00:16.924 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.033) 0:00:16.958 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191 Saturday 09 August 2025 08:22:22 -0400 (0:00:00.061) 0:00:17.019 ******* skipping: [managed-node2] => { "censored": "the output 
has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] ***
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205
Saturday 09 August 2025 08:22:22 -0400 (0:00:00.018) 0:00:17.038 *******
ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] ***
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219
Saturday 09 August 2025 08:22:22 -0400 (0:00:00.020) 0:00:17.081 *******
skipping: [managed-node2] => { "false_condition": "ad_integration_join_to_dc == __ad_integration_sample_dc or ad_integration_realm == __ad_integration_sample_realm or ansible_check_mode" }
TASK [fedora.linux_system_roles.ad_integration : Run realm join command] *******
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231
Saturday 09 August 2025 08:22:22 -0400 (0:00:00.550) 0:00:17.101 *******
changed: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] ***
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
Saturday 09 August 2025 08:22:23 -0400 (0:00:00.485) 0:00:17.652 *******
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1754742143.0260122, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "049911c7517fba993eeb39dc494de8bf33faa685", "ctime": 1754742143.0250123, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 12784492, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1754742143.0250123, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 87, "uid": 0, "version": "2136357593", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] ***
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263
Saturday 09 August 2025 08:22:23 -0400 (0:00:00.499) 0:00:18.137 *******
ok: [managed-node2] => { "changed": false, "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZAoK", "encoding": "base64", "source": "/etc/sssd/sssd.conf" }
TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] ***
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269
Saturday 09 August 2025 08:22:24 -0400 (0:00:00.034) 0:00:18.637 *******
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
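Decoded for readability: the base64 "content" returned by the "Grab sssd.conf if it exists after realm join" task above is the following sssd.conf found after the realm join step (87 bytes, matching the stat result above):

    [domain/dyndns-sample-realm.com]
    ad_domain = dyndns-sample-realm.com
    id_provider = ad

The "Configure dynamic DNS updates" task that follows adds the dyndns_* options and ad_hostname to this file one option at a time (each per-item result below reports "option added" and a growing file size, from 87 up to 332 bytes); SSSD reads these dyndns options from the [domain/...] section of sssd.conf.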
TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ******
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284
Saturday 09 August 2025 08:22:24 -0400 (0:00:00.034) 0:00:18.671 *******
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] ***
task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Saturday 09 August 2025 08:22:24 -0400 (0:00:00.012) 0:00:18.683 *******
changed: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 108, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_ttl', 'value': '3600'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_ttl", "value": "3600" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 126, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "TESTING" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 149, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 181, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_update_ptr", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 206, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_force_tcp", "value": "False" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 231, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 254, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_server", 
"value": "127.0.0.1" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 280, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: option added Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. TASK [fedora.linux_system_roles.ad_integration : Configure custom SSSD settings] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:336 Saturday 09 August 2025 08:22:27 -0400 (0:00:03.527) 0:00:22.211 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:353 Saturday 09 August 2025 08:22:27 -0400 (0:00:00.013) 0:00:22.225 ******* skipping: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value is none or item.value == ''", "item": { "key": "dyndns_iface", "value": "TESTING" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value is none or item.value == ''", "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check custom dyndns settings] ******************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:49 Saturday 09 August 2025 08:22:27 -0400 (0:00:00.060) 0:00:22.285 ******* ok: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_iface', 
'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "TESTING" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK TASK [Search /var/log/sssd/sssd.log for [sss_ini_call_validators]] ************* task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:72 Saturday 09 August 2025 08:22:29 -0400 (0:00:01.410) 0:00:23.696 ******* ok: [managed-node2] => { "changed": false, "cmd": [ "grep", "-i", "sss_ini_call_validators", "/var/log/sssd/sssd.log" ], "delta": "0:00:00.003376", "end": "2025-08-09 08:22:29.447469", "failed_when_result": false, "rc": 2, "start": "2025-08-09 08:22:29.444093" } STDERR: grep: /var/log/sssd/sssd.log: No such file or directory MSG: non-zero return code TASK [Fail if signature found] ************************************************* task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:78 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.356) 0:00:24.053 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "sssd_log.stdout | length > 0", "skip_reason": "Conditional result was False" } TASK [Test - Re-run the system role to remove vars] **************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:83 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.031) 0:00:24.084 ******* included: fedora.linux_system_roles.ad_integration for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.072) 0:00:24.157 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not ad_integration_realm", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.013) 0:00:24.170 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_timesync_source is not none", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] 
*** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.018) 0:00:24.188 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.031) 0:00:24.220 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.030) 0:00:24.250 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.030) 0:00:24.280 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.029) 0:00:24.310 ******* included: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.021) 0:00:24.332 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.033) 0:00:24.365 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.021) 0:00:24.387 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : 
Set platform/version specific variables] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.030) 0:00:24.417 ******* skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_10.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_10.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_10.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_10.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Saturday 09 August 2025 08:22:29 -0400 (0:00:00.034) 0:00:24.451 ******* ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: PackageKit realmd TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Saturday 09 August 2025 08:22:30 -0400 (0:00:00.776) 0:00:25.228 ******* ok: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Sat 2025-08-09 08:22:21 EDT", "ActiveEnterTimestampMonotonic": "188808514", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target system.slice basic.target systemd-journald.socket dbus.socket", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Sat 2025-08-09 08:22:21 EDT", "AssertTimestampMonotonic": "188789707", "Before": "shutdown.target", "BindLogSockets": "no", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "13589000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanLiveMount": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap 
cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Sat 2025-08-09 08:22:21 EDT", "ConditionTimestampMonotonic": "188789701", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroup": "/system.slice/realmd.service", "ControlGroupId": "4700", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DebugInvocation": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "\"man:realm(8)\" \"man:realmd.conf(5)\"", "DynamicUser": "no", "EffectiveMemoryHigh": "3631226880", "EffectiveMemoryMax": "3631226880", "EffectiveTasksMax": "21831", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Sat 2025-08-09 08:22:21 EDT", "ExecMainHandoffTimestampMonotonic": "188799274", "ExecMainPID": "10045", "ExecMainStartTimestamp": "Sat 2025-08-09 08:22:21 EDT", "ExecMainStartTimestampMonotonic": "188790540", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[Sat 2025-08-09 08:22:21 EDT] ; stop_time=[n/a] ; pid=10045 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; flags= ; start_time=[Sat 2025-08-09 08:22:21 EDT] ; stop_time=[n/a] ; pid=10045 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sat 2025-08-09 08:22:21 EDT", "InactiveExitTimestampMonotonic": "188791058", "InvocationID": "b86f34a4e6974a82aabc2afae9d99924", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13644", "LimitNPROCSoft": "13644", 
"LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13644", "LimitSIGPENDINGSoft": "13644", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LiveMountResult": "success", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "10045", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureDurationUSec": "[not set]", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3187621888", "MemoryCurrent": "1732608", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "2256896", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "0", "MemorySwapMax": "infinity", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivatePIDs": "no", "PrivateTmp": "no", "PrivateTmpEx": "no", "PrivateUsers": "no", "PrivateUsersEx": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectControlGroupsEx": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestamp": "Sat 2025-08-09 08:22:21 EDT", "StateChangeTimestampMonotonic": "188808514", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "5", "TasksMax": "21831", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Saturday 09 August 2025 08:22:31 -0400 (0:00:00.550) 0:00:25.779 ******* ok: [managed-node2] => { "changed": false, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "mode": "0400", "owner": "root", "path": "/etc/realmd.conf", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.ad_integration : Flush handlers] *************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:75 Saturday 09 August 2025 08:22:31 -0400 (0:00:00.730) 0:00:26.509 ******* NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd for managed-node2 META: triggered running handlers for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:10 Saturday 09 August 2025 08:22:31 -0400 (0:00:00.002) 0:00:26.512 ******* skipping: [managed-node2] => (item=sssd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "sssd", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Saturday 09 August 2025 08:22:31 -0400 (0:00:00.038) 0:00:26.550 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.036) 0:00:26.587 
******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.030) 0:00:26.618 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_crypto_policies | bool", "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.029) 0:00:26.648 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.029) 0:00:26.677 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_sssd_merge_duplicate_sections | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.029) 0:00:26.707 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.029) 0:00:26.736 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.027) 0:00:26.764 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.028) 0:00:26.793 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.030) 0:00:26.823 
******* skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.031) 0:00:26.854 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.032) 0:00:26.887 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.045) 0:00:26.932 ******* skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.026) 0:00:26.959 ******* ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.044) 0:00:27.003 ******* skipping: [managed-node2] => { "false_condition": "ad_integration_join_to_dc == __ad_integration_sample_dc or ad_integration_realm == __ad_integration_sample_realm or ansible_check_mode" } TASK [fedora.linux_system_roles.ad_integration : Run realm join command] ******* task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.013) 0:00:27.016 ******* changed: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 Saturday 09 August 2025 08:22:32 -0400 (0:00:00.389) 0:00:27.406 ******* ok: [managed-node2] => { "changed": false, "stat": { "atime": 1754742147.5990503, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f1525f35c7b671e667311b43498d537dc0b75297", "ctime": 1754742147.5970502, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 612368580, "isblk": false, "ischr": false, "isdir": false, "isfifo": 
false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1754742147.598244, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 332, "uid": 0, "version": "2133760234", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263 Saturday 09 August 2025 08:22:33 -0400 (0:00:00.410) 0:00:27.816 ******* ok: [managed-node2] => { "changed": false, "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZApkeW5kbnNfdXBkYXRlID0gVHJ1ZQpkeW5kbnNfdHRsID0gMzYwMApkeW5kbnNfaWZhY2UgPSBURVNUSU5HCmR5bmRuc19yZWZyZXNoX2ludGVydmFsID0gODY0MDAKZHluZG5zX3VwZGF0ZV9wdHIgPSBUcnVlCmR5bmRuc19mb3JjZV90Y3AgPSBGYWxzZQpkeW5kbnNfYXV0aCA9IEdTUy1UU0lHCmR5bmRuc19zZXJ2ZXIgPSAxMjcuMC4wLjEKYWRfaG9zdG5hbWUgPSBtYW5hZ2VkLW5vZGUyLmR5bmRucy1zYW1wbGUtcmVhbG0uY29tCgo=", "encoding": "base64", "source": "/etc/sssd/sssd.conf" } TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269 Saturday 09 August 2025 08:22:33 -0400 (0:00:00.367) 0:00:28.184 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ****** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284 Saturday 09 August 2025 08:22:33 -0400 (0:00:00.037) 0:00:28.221 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Saturday 09 August 2025 08:22:33 -0400 (0:00:00.012) 0:00:28.234 ******* ok: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_ttl', 'value': '3600'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_ttl", "value": "3600" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK skipping: [managed-node2] => (item={'key': 'dyndns_iface', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value != ''", "item": { "key": "dyndns_iface", "value": "" }, "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "mode": "0600", "owner": 
"root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update_ptr", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_force_tcp", "value": "False" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK skipping: [managed-node2] => (item={'key': 'dyndns_auth', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value != ''", "item": { "key": "dyndns_auth", "value": "" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item={'key': 'dyndns_server', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value != ''", "item": { "key": "dyndns_server", "value": "" }, "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK TASK [fedora.linux_system_roles.ad_integration : Configure custom SSSD settings] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:336 Saturday 09 August 2025 08:22:35 -0400 (0:00:02.278) 0:00:30.512 ******* skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:353 Saturday 09 August 2025 08:22:35 -0400 (0:00:00.013) 0:00:30.526 ******* changed: [managed-node2] => (item={'key': 'dyndns_iface', 'value': ''}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 309, "state": "file", "uid": 0 } MSG: option changed changed: [managed-node2] => (item={'key': 'dyndns_server', 'value': ''}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 283, "state": "file", "uid": 0 } MSG: option changed Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. 
TASK [Restart sssd] ************************************************************ task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:91 Saturday 09 August 2025 08:22:36 -0400 (0:00:00.792) 0:00:31.318 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Check custom dyndns settings are removed] ******************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:97 Saturday 09 August 2025 08:22:36 -0400 (0:00:00.065) 0:00:31.384 ******* ok: [managed-node2] => (item={'key': 'dyndns_iface', 'value': None}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": null }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 283, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_server', 'value': None}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": null }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 283, "state": "file", "uid": 0 } MSG: OK TASK [Gather facts] ************************************************************ task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:118 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.749) 0:00:32.134 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Get IP for host's FQDN] ************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:124 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.055) 0:00:32.189 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Get hostname for host's IP address] ************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:130 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.056) 0:00:32.245 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Assert IPv4 DNS records were created] ************************************ task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:136 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.057) 0:00:32.303 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Cleanup fake realm] ****************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.061) 
0:00:32.365 ******* included: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2 TASK [Get role variables] ****************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.120) 0:00:32.485 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.022) 0:00:32.508 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.024) 0:00:32.532 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Create fake realm cmd] *************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Saturday 09 August 2025 08:22:37 -0400 (0:00:00.022) 0:00:32.555 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Saturday 09 August 2025 08:22:38 -0400 (0:00:00.022) 0:00:32.577 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Saturday 09 August 2025 08:22:38 -0400 (0:00:00.021) 0:00:32.599 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] ******************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Saturday 09 August 2025 08:22:38 -0400 (0:00:00.022) 0:00:32.622 ******* changed: [managed-node2] => { "changed": true, "path": "/tmp/lsr_m78aggly_ad_int_realm.py", "state": "absent" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Saturday 09 August 2025 08:22:38 -0400 (0:00:00.388) 0:00:33.010 ******* skipping: [managed-node2] => { "changed": false, "false_condition": "__installed_sssd_package 
is changed", "skip_reason": "Conditional result was False" } NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd] *** task path: /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:10 Saturday 09 August 2025 08:22:38 -0400 (0:00:00.034) 0:00:33.045 ******* skipping: [managed-node2] => (item=sssd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "sssd", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped PLAY RECAP ********************************************************************* managed-node2 : ok=35 changed=10 unreachable=0 failed=0 skipped=76 rescued=0 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Saturday 09 August 2025 08:22:38 -0400 (0:00:00.049) 0:00:33.094 ******* =============================================================================== fedora.linux_system_roles.ad_integration : Ensure required packages are installed -- 10.09s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 3.53s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Gathering Facts --------------------------------------------------------- 2.35s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20 fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 2.28s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Check custom dyndns settings -------------------------------------------- 1.41s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:49 fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options --- 0.79s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:353 fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.78s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Create fake realm cmd --------------------------------------------------- 0.78s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 fedora.linux_system_roles.ad_integration : Ensure required packages are installed --- 0.78s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Check custom dyndns settings are removed -------------------------------- 0.75s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:97 fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.73s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.70s 
/tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.55s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.55s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join --- 0.50s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263 fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.49s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 fedora.linux_system_roles.ad_integration : Check if system is ostree ---- 0.44s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Create a temp file for fake realm cmd ----------------------------------- 0.42s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.41s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.39s /tmp/collections-HVo/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231