ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_dyndns.yml *****************************************************
1 plays in /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml
[WARNING]: Could not match supplied host pattern, ignoring: ad

PLAY [Ensure that the role configures dynamic dns] *****************************
META: ran handlers

TASK [Test - Run the system role with bogus vars] ******************************
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:31
Friday 20 June 2025 14:50:29 -0400 (0:00:00.020) 0:00:00.020 ***********

TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] ***
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
Friday 20 June 2025 14:50:29 -0400 (0:00:00.069) 0:00:00.089 ***********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] ***
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8
Friday 20 June 2025 14:50:29 -0400 (0:00:00.032) 0:00:00.121 ***********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] ***
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15
Friday 20 June 2025 14:50:29 -0400 (0:00:00.030) 0:00:00.152 ***********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] ***
task path:
/tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Friday 20 June 2025 14:50:29 -0400 (0:00:00.030) 0:00:00.183 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Friday 20 June 2025 14:50:29 -0400 (0:00:00.030) 0:00:00.213 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Friday 20 June 2025 14:50:29 -0400 (0:00:00.030) 0:00:00.244 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Friday 20 June 2025 14:50:29 -0400 (0:00:00.029) 0:00:00.273 *********** included: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Friday 20 June 2025 14:50:29 -0400 (0:00:00.020) 0:00:00.294 *********** ok: [managed-node2] TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Friday 20 June 2025 14:50:30 -0400 (0:00:00.863) 0:00:01.157 *********** ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Friday 20 June 2025 14:50:31 -0400 (0:00:00.415) 0:00:01.573 *********** ok: [managed-node2] => { "ansible_facts": { "__ad_integration_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Friday 20 June 2025 14:50:31 -0400 (0:00:00.043) 0:00:01.617 *********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure required 
packages are installed] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Friday 20 June 2025 14:50:31 -0400 (0:00:00.050) 0:00:01.668 *********** changed: [managed-node2] => { "changed": true, "rc": 0, "results": [ "Installed: dejavu-sans-mono-fonts-2.35-7.el8.noarch", "Installed: gsettings-desktop-schemas-3.32.0-6.el8.x86_64", "Installed: realmd-0.17.1-2.el8.x86_64", "Installed: PackageKit-glib-1.1.12-7.el8.x86_64", "Installed: abattis-cantarell-fonts-0.0.25-6.el8.noarch", "Installed: libproxy-0.4.15-5.2.el8.x86_64", "Installed: gdk-pixbuf2-2.36.12-5.el8.x86_64", "Installed: libstemmer-0-10.585svn.el8.x86_64", "Installed: glib-networking-2.56.1-1.1.el8.x86_64", "Installed: libmodman-2.0.1-17.el8.x86_64", "Installed: json-glib-1.4.4-1.el8.x86_64", "Installed: libappstream-glib-0.7.14-3.el8.x86_64", "Installed: fontpackages-filesystem-1.44-22.el8.noarch", "Installed: libsoup-2.62.3-5.el8.x86_64", "Installed: PackageKit-1.1.12-7.el8.x86_64", "Installed: dejavu-fonts-common-2.35-7.el8.noarch" ] } TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Friday 20 June 2025 14:50:48 -0400 (0:00:17.652) 0:00:19.321 *********** changed: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket sysinit.target basic.target dbus.socket system.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", 
"StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Friday 20 June 2025 14:50:49 -0400 (0:00:01.095) 0:00:20.416 *********** NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services for managed-node2 changed: [managed-node2] => { "changed": true, "checksum": "0fe3f55e582b3bf7fd00d6665b2e8dcfd0f0d645", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "md5sum": "c03fe7d9bbd0f743ca4fbe88f5320506", "mode": "0400", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 174, "src": "/root/.ansible/tmp/ansible-tmp-1750445449.9682064-8373-180648100458514/source", "state": "file", "uid": 0 } RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3 Friday 20 June 2025 14:50:50 -0400 (0:00:00.754) 0:00:21.171 *********** skipping: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "item": "realmd", "skip_reason": "Conditional result was False" } META: ran handlers TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Friday 20 June 2025 14:50:50 -0400 (0:00:00.032) 0:00:21.203 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Friday 20 June 2025 14:50:50 -0400 (0:00:00.020) 0:00:21.223 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Friday 20 June 2025 14:50:50 -0400 (0:00:00.018) 0:00:21.242 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: 
/tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Friday 20 June 2025 14:50:50 -0400 (0:00:00.012) 0:00:21.255 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:125 Friday 20 June 2025 14:50:50 -0400 (0:00:00.017) 0:00:21.272 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:131 Friday 20 June 2025 14:50:50 -0400 (0:00:00.013) 0:00:21.286 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:139 Friday 20 June 2025 14:50:50 -0400 (0:00:00.014) 0:00:21.300 *********** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Friday 20 June 2025 14:50:50 -0400 (0:00:00.025) 0:00:21.326 *********** ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:167 Friday 20 June 2025 14:50:50 -0400 (0:00:00.033) 0:00:21.360 *********** ok: [managed-node2] => {} MSG: ['Would run the following command. 
Note that ad_integration_join_parameters have been removed for security purposes, the role will pass them to the actual realm join command when running without check mode.', 'realm join -U Administrator --membership-software adcli sample-realm.com'] TASK [fedora.linux_system_roles.ad_integration : Run realm join command] ******* task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Friday 20 June 2025 14:50:50 -0400 (0:00:00.029) 0:00:21.389 *********** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:208 Friday 20 June 2025 14:50:50 -0400 (0:00:00.019) 0:00:21.409 *********** ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:213 Friday 20 June 2025 14:50:51 -0400 (0:00:00.322) 0:00:21.731 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219 Friday 20 June 2025 14:50:51 -0400 (0:00:00.023) 0:00:21.754 *********** TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:234 Friday 20 June 2025 14:50:51 -0400 (0:00:00.049) 0:00:21.804 *********** TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ****** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:246 Friday 20 June 2025 14:50:51 -0400 (0:00:00.035) 0:00:21.839 *********** TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:260 Friday 20 June 2025 14:50:51 -0400 (0:00:00.011) 0:00:21.850 *********** changed: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 48, "state": "file", "uid": 0, "warnings": [ "The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change." 
] } MSG: section and option added changed: [managed-node2] => (item={'key': 'dyndns_ttl', 'value': '3600'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_ttl", "value": "3600" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 66, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "TESTING" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 89, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 121, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_update_ptr", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 146, "state": "file", "uid": 0, "warnings": [ "The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change." ] } MSG: option added changed: [managed-node2] => (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_force_tcp", "value": "False" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 171, "state": "file", "uid": 0, "warnings": [ "The value False (type bool) in a string field was converted to 'False' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change." 
] } MSG: option added changed: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 194, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 220, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'ad_hostname', 'value': 'managed-node2.sample-realm.com'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "ad_hostname", "value": "managed-node2.sample-realm.com" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: option added NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd for managed-node2 [WARNING]: The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change. [WARNING]: The value False (type bool) in a string field was converted to 'False' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change. TASK [fedora.linux_system_roles.ad_integration : Configure custom SSSD settings] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Friday 20 June 2025 14:50:54 -0400 (0:00:03.263) 0:00:25.114 *********** TASK [fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:315 Friday 20 June 2025 14:50:54 -0400 (0:00:00.020) 0:00:25.135 *********** skipping: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_iface", "value": "TESTING" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "skip_reason": "Conditional result was False" } TASK [Check custom dyndns settings] ******************************************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:41 Friday 20 June 2025 14:50:54 -0400 (0:00:00.045) 0:00:25.180 *********** ok: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0, "warnings": [ "The value True (type bool) in a string field was converted to 'True' (type string). 
If this does not look like what you expect, quote the entire value to ensure it does not change." ] } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "TESTING" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK TASK [Search /var/log/sssd/sssd.log for [sss_ini_call_validators]] ************* task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:64 Friday 20 June 2025 14:50:56 -0400 (0:00:01.318) 0:00:26.498 *********** ok: [managed-node2] => { "changed": false, "cmd": [ "grep", "-i", "sss_ini_call_validators", "/var/log/sssd/sssd.log" ], "delta": "0:00:00.002809", "end": "2025-06-20 14:50:56.372962", "failed_when_result": false, "rc": 2, "start": "2025-06-20 14:50:56.370153" } STDERR: grep: /var/log/sssd/sssd.log: No such file or directory MSG: non-zero return code TASK [Fail if signature found] ************************************************* task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:70 Friday 20 June 2025 14:50:56 -0400 (0:00:00.400) 0:00:26.899 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Test - Re-run the system role to remove vars] **************************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:75 Friday 20 June 2025 14:50:56 -0400 (0:00:00.014) 0:00:26.913 *********** TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 Friday 20 June 2025 14:50:56 -0400 (0:00:00.062) 0:00:26.976 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8 Friday 20 June 2025 14:50:56 -0400 (0:00:00.021) 0:00:26.997 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] *** task path: 
/tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15 Friday 20 June 2025 14:50:56 -0400 (0:00:00.022) 0:00:27.019 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Friday 20 June 2025 14:50:56 -0400 (0:00:00.020) 0:00:27.040 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Friday 20 June 2025 14:50:56 -0400 (0:00:00.019) 0:00:27.060 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Friday 20 June 2025 14:50:56 -0400 (0:00:00.020) 0:00:27.080 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Friday 20 June 2025 14:50:56 -0400 (0:00:00.018) 0:00:27.099 *********** included: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Friday 20 June 2025 14:50:56 -0400 (0:00:00.033) 0:00:27.133 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Friday 20 June 2025 14:50:56 -0400 (0:00:00.021) 0:00:27.154 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Friday 20 June 2025 14:50:56 -0400 (0:00:00.018) 0:00:27.173 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Friday 20 June 2025 14:50:56 -0400 (0:00:00.019) 0:00:27.193 *********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Friday 20 June 2025 14:50:56 -0400 (0:00:00.048) 0:00:27.241 *********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Friday 20 June 2025 14:50:59 -0400 (0:00:03.162) 0:00:30.403 *********** ok: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "ActiveEnterTimestamp": "Fri 2025-06-20 14:50:49 EDT", "ActiveEnterTimestampMonotonic": "221723683", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "systemd-journald.socket sysinit.target dbus.socket system.slice basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Fri 2025-06-20 14:50:49 EDT", "AssertTimestampMonotonic": "221695593", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Fri 2025-06-20 14:50:49 EDT", "ConditionTimestampMonotonic": "221695592", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroup": "/system.slice/realmd.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8407", 
"ExecMainStartTimestamp": "Fri 2025-06-20 14:50:49 EDT", "ExecMainStartTimestampMonotonic": "221703808", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[Fri 2025-06-20 14:50:49 EDT] ; stop_time=[n/a] ; pid=8407 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Fri 2025-06-20 14:50:49 EDT", "InactiveExitTimestampMonotonic": "221703850", "InvocationID": "275918de2dc24569b80906dfc27efc8d", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8407", "MemoryAccounting": "yes", "MemoryCurrent": "4202496", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": 
"0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2025-06-20 14:50:49 EDT", "StateChangeTimestampMonotonic": "221723683", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestamp": "Fri 2025-06-20 14:50:49 EDT", "WatchdogTimestampMonotonic": "221723680", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Friday 20 June 2025 14:51:00 -0400 (0:00:00.793) 0:00:31.196 *********** ok: [managed-node2] => { "changed": false, "checksum": "0fe3f55e582b3bf7fd00d6665b2e8dcfd0f0d645", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "mode": "0400", "owner": "root", "path": "/etc/realmd.conf", "secontext": "system_u:object_r:etc_t:s0", "size": 174, "state": "file", "uid": 0 } RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:10 Friday 20 June 2025 14:51:01 -0400 (0:00:00.758) 0:00:31.955 *********** skipping: [managed-node2] => (item=sssd) => { "ansible_loop_var": "item", "changed": false, "item": "sssd", "skip_reason": "Conditional result was False" } META: ran handlers TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Friday 20 June 2025 14:51:01 -0400 (0:00:00.023) 0:00:31.978 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Friday 20 June 2025 14:51:01 -0400 (0:00:00.014) 0:00:31.993 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Friday 20 June 2025 14:51:01 -0400 (0:00:00.014) 0:00:32.007 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: 
/tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Friday 20 June 2025 14:51:01 -0400 (0:00:00.013) 0:00:32.021 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:125 Friday 20 June 2025 14:51:01 -0400 (0:00:00.012) 0:00:32.034 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:131 Friday 20 June 2025 14:51:01 -0400 (0:00:00.013) 0:00:32.047 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:139 Friday 20 June 2025 14:51:01 -0400 (0:00:00.013) 0:00:32.060 *********** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Friday 20 June 2025 14:51:01 -0400 (0:00:00.012) 0:00:32.072 *********** ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:167 Friday 20 June 2025 14:51:01 -0400 (0:00:00.017) 0:00:32.090 *********** ok: [managed-node2] => {} MSG: ['Would run the following command. 
Note that ad_integration_join_parameters have been removed for security purposes, the role will pass them to the actual realm join command when running without check mode.', 'realm join -U Administrator --membership-software adcli sample-realm.com'] TASK [fedora.linux_system_roles.ad_integration : Run realm join command] ******* task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Friday 20 June 2025 14:51:01 -0400 (0:00:00.016) 0:00:32.106 *********** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:208 Friday 20 June 2025 14:51:01 -0400 (0:00:00.014) 0:00:32.121 *********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1750445454.571886, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1d6a87971b82412da0ffc37e56cc5e9db38a3ce5", "ctime": 1750445454.569886, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 339738756, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1750445454.568886, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 265, "uid": 0, "version": "2898385493", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:213 Friday 20 June 2025 14:51:01 -0400 (0:00:00.343) 0:00:32.464 *********** ok: [managed-node2] => { "changed": false, "content": "Cltkb21haW4vc2FtcGxlLXJlYWxtLmNvbV0KZHluZG5zX3VwZGF0ZSA9IFRydWUKZHluZG5zX3R0bCA9IDM2MDAKZHluZG5zX2lmYWNlID0gVEVTVElORwpkeW5kbnNfcmVmcmVzaF9pbnRlcnZhbCA9IDg2NDAwCmR5bmRuc191cGRhdGVfcHRyID0gVHJ1ZQpkeW5kbnNfZm9yY2VfdGNwID0gRmFsc2UKZHluZG5zX2F1dGggPSBHU1MtVFNJRwpkeW5kbnNfc2VydmVyID0gMTI3LjAuMC4xCmFkX2hvc3RuYW1lID0gbWFuYWdlZC1ub2RlMi5zYW1wbGUtcmVhbG0uY29tCg==", "encoding": "base64", "source": "/etc/sssd/sssd.conf" } TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219 Friday 20 June 2025 14:51:02 -0400 (0:00:00.468) 0:00:32.933 *********** TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:234 Friday 20 June 2025 14:51:02 -0400 (0:00:00.049) 0:00:32.982 *********** TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ****** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:246 Friday 20 June 2025 14:51:02 -0400 (0:00:00.055) 0:00:33.037 *********** TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] *** task path: 
/tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:260 Friday 20 June 2025 14:51:02 -0400 (0:00:00.017) 0:00:33.055 *********** ok: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0, "warnings": [ "The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change." ] } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_ttl', 'value': '3600'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_ttl", "value": "3600" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK skipping: [managed-node2] => (item={'key': 'dyndns_iface', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_iface", "value": "" }, "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update_ptr", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0, "warnings": [ "The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change." ] } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_force_tcp", "value": "False" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0, "warnings": [ "The value False (type bool) in a string field was converted to 'False' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change." 
] } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK skipping: [managed-node2] => (item={'key': 'dyndns_server', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_server", "value": "" }, "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item={'key': 'ad_hostname', 'value': 'managed-node2.sample-realm.com'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "ad_hostname", "value": "managed-node2.sample-realm.com" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 265, "state": "file", "uid": 0 } MSG: OK TASK [fedora.linux_system_roles.ad_integration : Configure custom SSSD settings] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Friday 20 June 2025 14:51:05 -0400 (0:00:02.639) 0:00:35.695 *********** TASK [fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options] *** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:315 Friday 20 June 2025 14:51:05 -0400 (0:00:00.012) 0:00:35.708 *********** changed: [managed-node2] => (item={'key': 'dyndns_iface', 'value': ''}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 242, "state": "file", "uid": 0 } MSG: option changed changed: [managed-node2] => (item={'key': 'dyndns_server', 'value': ''}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 216, "state": "file", "uid": 0 } MSG: option changed NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd for managed-node2 TASK [Restart sssd] ************************************************************ task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:83 Friday 20 June 2025 14:51:05 -0400 (0:00:00.681) 0:00:36.389 *********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check custom dyndns settings are removed] ******************************** task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:89 Friday 20 June 2025 14:51:05 -0400 (0:00:00.016) 0:00:36.405 *********** ok: [managed-node2] => (item={'key': 'dyndns_iface', 'value': None}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": null }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "system_u:object_r:sssd_conf_t:s0", "size": 216, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_server', 'value': 
TASK [Gather facts] ************************************************************
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:110
Friday 20 June 2025  14:51:06 -0400 (0:00:00.631)       0:00:37.037 ***********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Get IP for host's FQDN] **************************************************
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:116
Friday 20 June 2025  14:51:06 -0400 (0:00:00.015)       0:00:37.053 ***********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Get hostname for host's IP address] **************************************
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:122
Friday 20 June 2025  14:51:06 -0400 (0:00:00.015)       0:00:37.068 ***********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Assert IPv4 DNS records were created] ************************************
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:128
Friday 20 June 2025  14:51:06 -0400 (0:00:00.021)       0:00:37.089 ***********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd] ***
task path: /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:10
Friday 20 June 2025  14:51:06 -0400 (0:00:00.013)       0:00:37.102 ***********
skipping: [managed-node2] => (item=sssd) => {"ansible_loop_var": "item", "changed": false, "item": "sssd", "skip_reason": "Conditional result was False"}
META: ran handlers
META: ran handlers
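Editor's note: the sssd restart handler notified by the cleanup task is skipped because its conditional evaluated false, matching the skipped "Restart sssd" test task above. The log only shows the handler's name, its path, and its single loop item (sssd); a plausible shape for it, assuming it restarts the listed services with ansible.builtin.service, would be:

    # Illustrative sketch of roles/ad_integration/handlers/main.yml:10; the module
    # choice is an assumption -- only the handler name and the 'sssd' loop item
    # are confirmed by this log.
    - name: Handler for ad_integration to restart services - sssd
      ansible.builtin.service:
        name: "{{ item }}"
        state: restarted
      loop:
        - sssd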
PLAY RECAP *********************************************************************
managed-node2              : ok=24   changed=5    unreachable=0    failed=0    skipped=52   rescued=0    ignored=0
SYSTEM ROLES ERRORS BEGIN v1
[]
SYSTEM ROLES ERRORS END v1
TASKS RECAP ********************************************************************
Friday 20 June 2025  14:51:06 -0400 (0:00:00.019)       0:00:37.121 ***********
===============================================================================
fedora.linux_system_roles.ad_integration : Ensure required packages are installed -- 17.65s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52
fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 3.26s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:260
fedora.linux_system_roles.ad_integration : Ensure required packages are installed --- 3.16s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52
fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 2.64s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:260
Check custom dyndns settings -------------------------------------------- 1.32s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:41
fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 1.10s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60
fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role --- 0.86s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.79s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60
fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.76s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67
fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.75s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67
fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options --- 0.68s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:315
Check custom dyndns settings are removed -------------------------------- 0.63s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:89
fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists --- 0.47s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:213
fedora.linux_system_roles.ad_integration : Check if system is ostree ---- 0.42s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Search /var/log/sssd/sssd.log for [sss_ini_call_validators] ------------- 0.40s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:64
fedora.linux_system_roles.ad_integration : See if sssd.conf exists ------ 0.34s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:208
fedora.linux_system_roles.ad_integration : See if sssd.conf exists ------ 0.32s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:208
Test - Run the system role with bogus vars ------------------------------ 0.07s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:31
Test - Re-run the system role to remove vars ---------------------------- 0.06s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:75
fedora.linux_system_roles.ad_integration : Remove duplicate sections ---- 0.06s  /tmp/collections-mC1/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:234