[WARNING]: Could not match supplied host pattern, ignoring: ad
ansible-playbook [core 2.16.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-eJP
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_dyndns.yml *****************************************************
1 plays in /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml

PLAY [Ensure that the role configures dynamic dns] *****************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
Saturday 13 September 2025 08:19:50 -0400 (0:00:00.033) 0:00:00.033 ****
ok: [managed-node2]

TASK [Setup fake realm] ********************************************************
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33
Saturday 13 September 2025 08:19:51 -0400 (0:00:01.180) 0:00:01.213 ****
included: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2

TASK [Get role variables] ******************************************************
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6
Saturday 13 September 2025 08:19:51 -0400 (0:00:00.045) 0:00:01.259 ****

TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
Saturday 13 September 2025 08:19:51 -0400 (0:00:00.032) 0:00:01.291 ****
skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] ****
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Saturday 13 September 2025 08:19:51 -0400 (0:00:00.034) 0:00:01.326 ****
ok: [managed-node2] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15
Saturday 13 September 2025 08:19:52 -0400 (0:00:00.403) 0:00:01.730 ****
ok: [managed-node2] => { "ansible_facts": { "__ad_integration_is_ostree": false }, "changed": false }

TASK
[fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Saturday 13 September 2025 08:19:52 -0400 (0:00:00.022) 0:00:01.753 **** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Saturday 13 September 2025 08:19:52 -0400 (0:00:00.036) 0:00:01.789 **** changed: [managed-node2] => { "changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_7_zzku59_ad_int_realm.py", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Saturday 13 September 2025 08:19:52 -0400 (0:00:00.393) 0:00:02.183 **** ok: [managed-node2] => { "ansible_facts": { "__ad_integration_realm_cmd": "/tmp/lsr_7_zzku59_ad_int_realm.py" }, "changed": false } TASK [Create fake realm cmd] *************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Saturday 13 September 2025 08:19:52 -0400 (0:00:00.018) 0:00:02.201 **** changed: [managed-node2] => { "changed": true, "checksum": "30318e4f54519605d60caa5bc62e429287b28973", "dest": "/tmp/lsr_7_zzku59_ad_int_realm.py", "gid": 0, "group": "root", "md5sum": "3ea3ed87c4442dcbe51dfff237c430ed", "mode": "0755", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 1867, "src": "/root/.ansible/tmp/ansible-tmp-1757765992.6514192-8192-170558517844569/source", "state": "file", "uid": 0 } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.738) 0:00:02.940 **** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1716968740.483, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1716968740.245, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 993, "gr_name": "sssd", "inode": 7060576, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": 
false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0700", "mtime": 1716968740.245, "nlink": 4, "path": "/etc/sssd", "pw_name": "sssd", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 31, "uid": 996, "version": "3583498373", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": true } } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.317) 0:00:03.257 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __sssd_dir_stat.stat.exists", "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] ******************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.015) 0:00:03.272 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.013) 0:00:03.285 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" } TASK [Test - Run the system role with bogus vars] ****************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.012) 0:00:03.298 **** TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.058) 0:00:03.356 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not ad_integration_realm", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.013) 0:00:03.370 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_timesync_source is not none", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.017) 0:00:03.387 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] *** task path: 
/tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.033) 0:00:03.420 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.032) 0:00:03.453 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.031) 0:00:03.484 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.032) 0:00:03.516 **** included: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.022) 0:00:03.538 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Saturday 13 September 2025 08:19:53 -0400 (0:00:00.035) 0:00:03.574 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Saturday 13 September 2025 08:19:54 -0400 (0:00:00.018) 0:00:03.593 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Saturday 13 September 2025 08:19:54 -0400 (0:00:00.019) 0:00:03.613 **** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] 
=> (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Saturday 13 September 2025 08:19:54 -0400 (0:00:00.039) 0:00:03.652 **** changed: [managed-node2] => { "changed": true, "rc": 0, "results": [ "Installed: dejavu-sans-mono-fonts-2.35-7.el8.noarch", "Installed: gsettings-desktop-schemas-3.32.0-6.el8.x86_64", "Installed: realmd-0.17.1-2.el8.x86_64", "Installed: PackageKit-glib-1.1.12-7.el8.x86_64", "Installed: abattis-cantarell-fonts-0.0.25-6.el8.noarch", "Installed: libproxy-0.4.15-5.2.el8.x86_64", "Installed: gdk-pixbuf2-2.36.12-5.el8.x86_64", "Installed: libstemmer-0-10.585svn.el8.x86_64", "Installed: glib-networking-2.56.1-1.1.el8.x86_64", "Installed: libmodman-2.0.1-17.el8.x86_64", "Installed: json-glib-1.4.4-1.el8.x86_64", "Installed: libappstream-glib-0.7.14-3.el8.x86_64", "Installed: fontpackages-filesystem-1.44-22.el8.noarch", "Installed: libsoup-2.62.3-5.el8.x86_64", "Installed: PackageKit-1.1.12-7.el8.x86_64", "Installed: dejavu-fonts-common-2.35-7.el8.noarch" ] } lsrpackages: PackageKit realmd TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Saturday 13 September 2025 08:20:10 -0400 (0:00:16.925) 0:00:20.578 **** changed: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "sysinit.target system.slice systemd-journald.socket dbus.socket basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace 
cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Saturday 13 September 2025 08:20:11 -0400 (0:00:00.780) 0:00:21.358 **** Notification for handler Handler for ad_integration to restart services has been saved. 
changed: [managed-node2] => { "changed": true, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "md5sum": "59e15d6f22a95d67b152af5a634072a8", "mode": "0400", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "src": "/root/.ansible/tmp/ansible-tmp-1757766011.8085394-8467-126209531946556/source", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.ad_integration : Flush handlers] *************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:75 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.669) 0:00:22.028 **** NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services for managed-node2 META: triggered running handlers for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.004) 0:00:22.033 **** skipping: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "realmd", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.051) 0:00:22.085 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.033) 0:00:22.118 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.029) 0:00:22.147 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_crypto_policies | bool", "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.031) 0:00:22.179 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.030) 0:00:22.209 **** skipping: [managed-node2] => { "changed": false, "false_condition": 
"ad_integration_sssd_merge_duplicate_sections | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.031) 0:00:22.241 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.050) 0:00:22.291 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.036) 0:00:22.328 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.031) 0:00:22.360 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.032) 0:00:22.392 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.036) 0:00:22.429 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.032) 0:00:22.461 **** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191 Saturday 13 September 2025 08:20:12 -0400 (0:00:00.033) 0:00:22.494 **** skipping: [managed-node2] => { "censored": "the output 
has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205
Saturday 13 September 2025 08:20:12 -0400 (0:00:00.020) 0:00:22.515 ****
ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219
Saturday 13 September 2025 08:20:12 -0400 (0:00:00.037) 0:00:22.552 ****
skipping: [managed-node2] => { "false_condition": "ad_integration_join_to_dc == __ad_integration_sample_dc or ad_integration_realm == __ad_integration_sample_realm or ansible_check_mode" }

TASK [fedora.linux_system_roles.ad_integration : Run realm join command] *******
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231
Saturday 13 September 2025 08:20:12 -0400 (0:00:00.012) 0:00:22.565 ****
changed: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
Saturday 13 September 2025 08:20:13 -0400 (0:00:00.464) 0:00:23.029 ****
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1757766013.3813543, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "049911c7517fba993eeb39dc494de8bf33faa685", "ctime": 1757766013.3803542, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 7074497, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1757766013.3803542, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 87, "uid": 0, "version": "2036297279", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263
Saturday 13 September 2025 08:20:13 -0400 (0:00:00.396) 0:00:23.425 ****
ok: [managed-node2] => { "changed": false, "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZAoK", "encoding": "base64", "source": "/etc/sssd/sssd.conf" }

TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269
Saturday 13 September 2025 08:20:14 -0400 (0:00:00.431) 0:00:23.857 ****
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
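For reference, the base64 "content" grabbed above decodes to the following 87-byte /etc/sssd/sssd.conf, matching the stat size reported above:

[domain/dyndns-sample-realm.com]
ad_domain = dyndns-sample-realm.com
id_provider = ad

The "Configure dynamic DNS updates" task that follows appends one "key = value" line per loop item; the reported file sizes (87 up to 332 bytes) account exactly for each added line. Assuming the options land in the [domain/dyndns-sample-realm.com] section in the loop order shown below (the log does not show the final file layout, so placement and ordering are an assumption), the file should end up roughly as:

[domain/dyndns-sample-realm.com]
ad_domain = dyndns-sample-realm.com
id_provider = ad
dyndns_update = True
dyndns_ttl = 3600
dyndns_iface = TESTING
dyndns_refresh_interval = 86400
dyndns_update_ptr = True
dyndns_force_tcp = False
dyndns_auth = GSS-TSIG
dyndns_server = 127.0.0.1
ad_hostname = managed-node2.dyndns-sample-realm.com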
TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ******
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284
Saturday 13 September 2025 08:20:14 -0400 (0:00:00.034) 0:00:23.892 ****
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Saturday 13 September 2025 08:20:14 -0400 (0:00:00.013) 0:00:23.906 ****
changed: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 108, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_ttl', 'value': '3600'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_ttl", "value": "3600" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 126, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "TESTING" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 149, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 181, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_update_ptr", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 206, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_force_tcp", "value": "False" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 231, "state": "file", "uid": 0 }
MSG: option added
changed: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 254, "state": "file", "uid": 0 }
MSG: option added
"value": "127.0.0.1" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 280, "state": "file", "uid": 0 } MSG: option added changed: [managed-node2] => (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: option added Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. TASK [fedora.linux_system_roles.ad_integration : Configure custom SSSD settings] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:336 Saturday 13 September 2025 08:20:17 -0400 (0:00:03.179) 0:00:27.086 **** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:353 Saturday 13 September 2025 08:20:17 -0400 (0:00:00.017) 0:00:27.103 **** skipping: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value is none or item.value == ''", "item": { "key": "dyndns_iface", "value": "TESTING" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value is none or item.value == ''", "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check custom dyndns settings] ******************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:49 Saturday 13 September 2025 08:20:17 -0400 (0:00:00.081) 0:00:27.185 **** ok: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_iface', 
ok: [managed-node2] => (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "TESTING" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 }
MSG: OK
ok: [managed-node2] => (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 }
MSG: OK
ok: [managed-node2] => (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 }
MSG: OK

TASK [Search /var/log/sssd/sssd.log for [sss_ini_call_validators]] *************
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:72
Saturday 13 September 2025 08:20:18 -0400 (0:00:01.273) 0:00:28.459 ****
ok: [managed-node2] => { "changed": false, "cmd": [ "grep", "-i", "sss_ini_call_validators", "/var/log/sssd/sssd.log" ], "delta": "0:00:00.002498", "end": "2025-09-13 08:20:19.139086", "failed_when_result": false, "rc": 2, "start": "2025-09-13 08:20:19.136588" }
STDERR: grep: /var/log/sssd/sssd.log: No such file or directory
MSG: non-zero return code

TASK [Fail if signature found] *************************************************
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:78
Saturday 13 September 2025 08:20:19 -0400 (0:00:00.307) 0:00:28.767 ****
skipping: [managed-node2] => { "changed": false, "false_condition": "sssd_log.stdout | length > 0", "skip_reason": "Conditional result was False" }

TASK [Test - Re-run the system role to remove vars] ****************************
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:83
Saturday 13 September 2025 08:20:19 -0400 (0:00:00.033) 0:00:28.801 ****

TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
Saturday 13 September 2025 08:20:19 -0400 (0:00:00.069) 0:00:28.871 ****
skipping: [managed-node2] => { "changed": false, "false_condition": "not ad_integration_realm", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] ***
task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8
Saturday 13 September 2025 08:20:19 -0400 (0:00:00.012) 0:00:28.883 ****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_timesync_source is not none", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] ***
task path:
/tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.016) 0:00:28.899 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.029) 0:00:28.929 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.034) 0:00:28.963 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.049) 0:00:29.013 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.047) 0:00:29.060 **** included: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.033) 0:00:29.094 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] **** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.051) 0:00:29.146 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.030) 0:00:29.176 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Set 
platform/version specific variables] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.026) 0:00:29.202 **** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 Saturday 13 September 2025 08:20:19 -0400 (0:00:00.048) 0:00:29.251 **** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: PackageKit realmd TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Saturday 13 September 2025 08:20:22 -0400 (0:00:02.458) 0:00:31.709 **** ok: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "ActiveEnterTimestamp": "Sat 2025-09-13 08:20:11 EDT", "ActiveEnterTimestampMonotonic": "198729594", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Sat 2025-09-13 08:20:11 EDT", "AssertTimestampMonotonic": "198702193", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin 
cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Sat 2025-09-13 08:20:11 EDT", "ConditionTimestampMonotonic": "198702192", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroup": "/system.slice/realmd.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8745", "ExecMainStartTimestamp": "Sat 2025-09-13 08:20:11 EDT", "ExecMainStartTimestampMonotonic": "198710340", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[Sat 2025-09-13 08:20:11 EDT] ; stop_time=[n/a] ; pid=8745 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sat 2025-09-13 08:20:11 EDT", "InactiveExitTimestampMonotonic": "198710381", "InvocationID": "ba6ab70bceb54db68c0725cac9170ed7", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8745", "MemoryAccounting": "yes", "MemoryCurrent": "2117632", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", 
"PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2025-09-13 08:20:11 EDT", "StateChangeTimestampMonotonic": "198729594", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestamp": "Sat 2025-09-13 08:20:11 EDT", "WatchdogTimestampMonotonic": "198729592", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] **** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Saturday 13 September 2025 08:20:22 -0400 (0:00:00.466) 0:00:32.176 **** ok: [managed-node2] => { "changed": false, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "mode": "0400", "owner": "root", "path": "/etc/realmd.conf", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.ad_integration : Flush handlers] *************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:75 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.750) 0:00:32.927 **** NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd for managed-node2 META: triggered running handlers for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:10 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.005) 0:00:32.932 **** skipping: [managed-node2] => (item=sssd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "sssd", "skip_reason": "Conditional 
result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Add AD server to existing network connection for DNS] ******************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.062) 0:00:32.994 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" } TASK [Manage timesync] ********************************************************* task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.051) 0:00:33.046 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" } TASK [Manage crypto policies] ************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.043) 0:00:33.089 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_crypto_policies | bool", "skip_reason": "Conditional result was False" } TASK [Enable crypto policy allowing RC4 encryption] **************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.045) 0:00:33.135 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ****** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.044) 0:00:33.180 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_sssd_merge_duplicate_sections | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.046) 0:00:33.226 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.042) 0:00:33.269 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.050) 0:00:33.319 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this 
result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.049) 0:00:33.368 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.052) 0:00:33.421 **** skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.050) 0:00:33.472 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] **** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.050) 0:00:33.523 **** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191 Saturday 13 September 2025 08:20:23 -0400 (0:00:00.053) 0:00:33.577 **** skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205 Saturday 13 September 2025 08:20:24 -0400 (0:00:00.028) 0:00:33.606 **** ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219 Saturday 13 September 2025 08:20:24 -0400 (0:00:00.058) 0:00:33.664 **** skipping: [managed-node2] => { "false_condition": "ad_integration_join_to_dc == __ad_integration_sample_dc or ad_integration_realm == __ad_integration_sample_realm or ansible_check_mode" } TASK [fedora.linux_system_roles.ad_integration : Run realm join command] ******* task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 Saturday 13 September 2025 08:20:24 -0400 (0:00:00.021) 0:00:33.685 **** changed: [managed-node2] => { "censored": "the output has been hidden due to the fact that 
'no_log: true' was specified for this result", "changed": true } TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 Saturday 13 September 2025 08:20:24 -0400 (0:00:00.394) 0:00:34.080 **** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1757766017.4343593, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f1525f35c7b671e667311b43498d537dc0b75297", "ctime": 1757766017.4333594, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 304087234, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1757766017.4323592, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 332, "uid": 0, "version": "4093424412", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263 Saturday 13 September 2025 08:20:24 -0400 (0:00:00.384) 0:00:34.464 **** ok: [managed-node2] => { "changed": false, "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZApkeW5kbnNfdXBkYXRlID0gVHJ1ZQpkeW5kbnNfdHRsID0gMzYwMApkeW5kbnNfaWZhY2UgPSBURVNUSU5HCmR5bmRuc19yZWZyZXNoX2ludGVydmFsID0gODY0MDAKZHluZG5zX3VwZGF0ZV9wdHIgPSBUcnVlCmR5bmRuc19mb3JjZV90Y3AgPSBGYWxzZQpkeW5kbnNfYXV0aCA9IEdTUy1UU0lHCmR5bmRuc19zZXJ2ZXIgPSAxMjcuMC4wLjEKYWRfaG9zdG5hbWUgPSBtYW5hZ2VkLW5vZGUyLmR5bmRucy1zYW1wbGUtcmVhbG0uY29tCgo=", "encoding": "base64", "source": "/etc/sssd/sssd.conf" } TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269 Saturday 13 September 2025 08:20:25 -0400 (0:00:00.361) 0:00:34.826 **** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ****** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284 Saturday 13 September 2025 08:20:25 -0400 (0:00:00.049) 0:00:34.876 **** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Saturday 13 September 2025 08:20:25 -0400 (0:00:00.018) 0:00:34.895 **** ok: [managed-node2] => (item={'key': 'dyndns_update', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_ttl', 'value': '3600'}) => { 
"ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_ttl", "value": "3600" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK skipping: [managed-node2] => (item={'key': 'dyndns_iface', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value != ''", "item": { "key": "dyndns_iface", "value": "" }, "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_update_ptr", "value": "True" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "dyndns_force_tcp", "value": "False" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK skipping: [managed-node2] => (item={'key': 'dyndns_auth', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value != ''", "item": { "key": "dyndns_auth", "value": "" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item={'key': 'dyndns_server', 'value': ''}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item.value != ''", "item": { "key": "dyndns_server", "value": "" }, "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => { "ansible_loop_var": "item", "changed": false, "gid": 0, "group": "root", "item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 332, "state": "file", "uid": 0 } MSG: OK TASK [fedora.linux_system_roles.ad_integration : Configure custom SSSD settings] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:336 Saturday 13 September 2025 08:20:27 -0400 (0:00:02.087) 0:00:36.982 **** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:353 Saturday 13 September 2025 08:20:27 -0400 (0:00:00.012) 0:00:36.994 **** changed: [managed-node2] => (item={'key': 'dyndns_iface', 'value': ''}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": "" }, "mode": "0600", "owner": "root", "path": 
"/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 309, "state": "file", "uid": 0 } MSG: option changed changed: [managed-node2] => (item={'key': 'dyndns_server', 'value': ''}) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": "" }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 283, "state": "file", "uid": 0 } MSG: option changed Notification for handler Handler for ad_integration to restart services - sssd has been saved. Notification for handler Handler for ad_integration to restart services - sssd has been saved. TASK [Restart sssd] ************************************************************ task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:91 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.693) 0:00:37.687 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Check custom dyndns settings are removed] ******************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:97 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.034) 0:00:37.722 **** ok: [managed-node2] => (item={'key': 'dyndns_iface', 'value': None}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_iface", "value": null }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 283, "state": "file", "uid": 0 } MSG: OK ok: [managed-node2] => (item={'key': 'dyndns_server', 'value': None}) => { "ansible_loop_var": "item", "changed": false, "failed_when_result": false, "gid": 0, "group": "root", "item": { "key": "dyndns_server", "value": null }, "mode": "0600", "owner": "root", "path": "/etc/sssd/sssd.conf", "secontext": "unconfined_u:object_r:sssd_conf_t:s0", "size": 283, "state": "file", "uid": 0 } MSG: OK TASK [Gather facts] ************************************************************ task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:118 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.659) 0:00:38.381 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Get IP for host's FQDN] ************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:124 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.036) 0:00:38.417 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Get hostname for host's IP address] ************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:130 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.033) 0:00:38.451 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK 
[Assert IPv4 DNS records were created] ************************************ task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:136 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.034) 0:00:38.485 **** skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | d(false)", "skip_reason": "Conditional result was False" } TASK [Cleanup fake realm] ****************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.033) 0:00:38.518 **** included: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2 TASK [Get role variables] ****************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6 Saturday 13 September 2025 08:20:28 -0400 (0:00:00.066) 0:00:38.585 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.013) 0:00:38.599 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.013) 0:00:38.612 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Create fake realm cmd] *************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.012) 0:00:38.625 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.013) 0:00:38.638 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.012) 0:00:38.651 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"setup\"", "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] 
******************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.015) 0:00:38.666 **** changed: [managed-node2] => { "changed": true, "path": "/tmp/lsr_7_zzku59_ad_int_realm.py", "state": "absent" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.331) 0:00:38.998 **** skipping: [managed-node2] => { "changed": false, "false_condition": "__installed_sssd_package is changed", "skip_reason": "Conditional result was False" } NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd for managed-node2 RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services - sssd] *** task path: /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:10 Saturday 13 September 2025 08:20:29 -0400 (0:00:00.021) 0:00:39.019 **** skipping: [managed-node2] => (item=sssd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "sssd", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped PLAY RECAP ********************************************************************* managed-node2 : ok=32 changed=10 unreachable=0 failed=0 skipped=76 rescued=0 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Saturday 13 September 2025 08:20:29 -0400 (0:00:00.036) 0:00:39.056 **** =============================================================================== fedora.linux_system_roles.ad_integration : Ensure required packages are installed -- 16.93s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 3.18s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 fedora.linux_system_roles.ad_integration : Ensure required packages are installed --- 2.46s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 2.09s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Check custom dyndns settings -------------------------------------------- 1.27s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:49 Gathering Facts --------------------------------------------------------- 1.18s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20 fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.78s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.75s 
/tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Create fake realm cmd --------------------------------------------------- 0.74s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 fedora.linux_system_roles.ad_integration : Cleanup dynamic DNS configuration options --- 0.69s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:353 fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.67s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Check custom dyndns settings are removed -------------------------------- 0.66s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:97 fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.47s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.46s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join --- 0.43s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263 fedora.linux_system_roles.ad_integration : Check if system is ostree ---- 0.40s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.40s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.39s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 Create a temp file for fake realm cmd ----------------------------------- 0.39s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.38s /tmp/collections-eJP/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
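
The "Grab sssd.conf if it exists after realm join" task above returns the file body base64-encoded in its "content" field. Below is a minimal sketch, assuming only the Python standard library, for decoding such a slurp result; the trailing comment shows what this run's content field decodes to (the dyndns_* options written by the fake realm join, before the role's cleanup task removed dyndns_iface and dyndns_server).

#!/usr/bin/env python3
# Decode the base64 "content" field of an Ansible slurp result.
# Usage (script name is illustrative): python3 decode_slurp.py < content.b64
import base64
import sys

# Read the base64 string (e.g. pasted from the task output) from stdin.
encoded = sys.stdin.read().strip()
print(base64.b64decode(encoded).decode("utf-8"), end="")

# Decoding the content field captured above yields:
# [domain/dyndns-sample-realm.com]
# ad_domain = dyndns-sample-realm.com
# id_provider = ad
# dyndns_update = True
# dyndns_ttl = 3600
# dyndns_iface = TESTING
# dyndns_refresh_interval = 86400
# dyndns_update_ptr = True
# dyndns_force_tcp = False
# dyndns_auth = GSS-TSIG
# dyndns_server = 127.0.0.1
# ad_hostname = managed-node2.dyndns-sample-realm.com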