ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
[WARNING]: Could not match supplied host pattern, ignoring: ad
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_dyndns.yml *****************************************************
1 plays in /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml

PLAY [Ensure that the role configures dynamic dns] *****************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
Friday 14 November 2025 08:53:19 -0500 (0:00:00.018) 0:00:00.018 *******
ok: [managed-node2]
META: ran handlers

TASK [Setup fake realm] ********************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33
Friday 14 November 2025 08:53:20 -0500 (0:00:01.002) 0:00:01.020 *******
included: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2

TASK [Get role variables] ******************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6
Friday 14 November 2025 08:53:20 -0500 (0:00:00.043) 0:00:01.064 *******

TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
Friday 14 November 2025 08:53:20 -0500 (0:00:00.048) 0:00:01.112 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] ****
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Friday 14 November 2025 08:53:20 -0500 (0:00:00.036) 0:00:01.148 *******
ok: [managed-node2] => {"changed": false, "stat": {"exists": false}}

TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15
Friday 14 November 2025 08:53:21 -0500 (0:00:00.458) 0:00:01.607 *******
ok: [managed-node2] => {"ansible_facts": {"__ad_integration_is_ostree": false}, "changed": false}

TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19
Friday 14 November 2025 08:53:21 -0500 (0:00:00.038) 0:00:01.645 *******
skipping: [managed-node2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node2] => (item=CentOS.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node2] => (item=CentOS_8.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node2] => (item=CentOS_8.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False"}

TASK [Create a temp file for fake realm cmd] ***********************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12
Friday 14 November 2025 08:53:21 -0500 (0:00:00.057) 0:00:01.703 *******
changed: [managed-node2] => {"changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_a6djy2_9_ad_int_realm.py", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0}

TASK [Set realm cmd variable for remainder of test] ****************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18
Friday 14 November 2025 08:53:21 -0500 (0:00:00.418) 0:00:02.121 *******
ok: [managed-node2] => {"ansible_facts": {"__ad_integration_realm_cmd": "/tmp/lsr_a6djy2_9_ad_int_realm.py"}, "changed": false}

TASK [Create fake realm cmd] ***************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22
Friday 14 November 2025 08:53:21 -0500 (0:00:00.040) 0:00:02.161 *******
changed: [managed-node2] => {"changed": true, "checksum": "30318e4f54519605d60caa5bc62e429287b28973", "dest": "/tmp/lsr_a6djy2_9_ad_int_realm.py", "gid": 0, "group": "root", "md5sum": "3ea3ed87c4442dcbe51dfff237c430ed", "mode": "0755", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 1867, "src": "/root/.ansible/tmp/ansible-tmp-1763128401.674985-7922-209647188625174/source", "state": "file", "uid": 0}

TASK [Check if /etc/sssd exists] ***********************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28
Friday 14 November 2025 08:53:22 -0500 (0:00:00.737) 0:00:02.899 *******
ok: [managed-node2] => {"changed": false, "stat": {"atime": 1716968740.483, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1716968740.245, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 993, "gr_name": "sssd", "inode": 7060576, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0700", "mtime": 1716968740.245, "nlink": 4, "path": "/etc/sssd", "pw_name": "sssd", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 31, "uid": 996, "version": "3583498373", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": true}}

TASK [Install sssd-common for /etc/sssd] ***************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33
Friday 14 November 2025 08:53:22 -0500 (0:00:00.344) 0:00:03.243 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Remove realm cmd] ********************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44
Friday 14 November 2025 08:53:22 -0500 (0:00:00.035) 0:00:03.278 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Remove sssd-common] ******************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49
Friday 14 November 2025 08:53:22 -0500 (0:00:00.032) 0:00:03.311 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Test - Run the system role with bogus vars] ******************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39
Friday 14 November 2025 08:53:22 -0500 (0:00:00.032) 0:00:03.343 *******

TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
Friday 14 November 2025 08:53:22 -0500 (0:00:00.081) 0:00:03.424 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8
Friday 14 November 2025 08:53:22 -0500 (0:00:00.034) 0:00:03.459 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15
Friday 14 November 2025 08:53:22 -0500 (0:00:00.033) 0:00:03.493 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25
Friday 14 November 2025 08:53:22 -0500 (0:00:00.033) 0:00:03.527 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30
Friday 14 November 2025 08:53:23 -0500 (0:00:00.033) 0:00:03.560 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40
Friday 14 November 2025 08:53:23 -0500 (0:00:00.033) 0:00:03.594 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49
Friday 14 November 2025 08:53:23 -0500 (0:00:00.039) 0:00:03.633 *******
included: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
Friday 14 November 2025 08:53:23 -0500 (0:00:00.023) 0:00:03.656 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] ****
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Friday 14 November 2025 08:53:23 -0500 (0:00:00.034) 0:00:03.691 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15
Friday 14 November 2025 08:53:23 -0500 (0:00:00.034) 0:00:03.725 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19
Friday 14 November 2025 08:53:23 -0500 (0:00:00.034) 0:00:03.759 *******
skipping: [managed-node2] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node2] => (item=CentOS.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node2] => (item=CentOS_8.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node2] => (item=CentOS_8.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52
Friday 14 November 2025 08:53:23 -0500 (0:00:00.053) 0:00:03.813 *******
changed: [managed-node2] => {"changed": true, "rc": 0, "results": ["Installed: dejavu-sans-mono-fonts-2.35-7.el8.noarch", "Installed: gsettings-desktop-schemas-3.32.0-6.el8.x86_64", "Installed: realmd-0.17.1-2.el8.x86_64", "Installed: PackageKit-glib-1.1.12-7.el8.x86_64", "Installed: abattis-cantarell-fonts-0.0.25-6.el8.noarch", "Installed: libproxy-0.4.15-5.2.el8.x86_64", "Installed: gdk-pixbuf2-2.36.12-5.el8.x86_64", "Installed: libstemmer-0-10.585svn.el8.x86_64", "Installed: glib-networking-2.56.1-1.1.el8.x86_64", "Installed: libmodman-2.0.1-17.el8.x86_64", "Installed: json-glib-1.4.4-1.el8.x86_64", "Installed: libappstream-glib-0.7.14-3.el8.x86_64", "Installed: fontpackages-filesystem-1.44-22.el8.noarch", "Installed: libsoup-2.62.3-5.el8.x86_64", "Installed: PackageKit-1.1.12-7.el8.x86_64", "Installed: dejavu-fonts-common-2.35-7.el8.noarch"]}

TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60
Friday 14 November 2025 08:53:39 -0500 (0:00:16.661) 0:00:20.475 *******
changed: [managed-node2] => (item=realmd) => {"ansible_loop_var": "item", "changed": true, "enabled": true, "item": "realmd", "name": "realmd", "state": "started",
  "status": {"ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "sysinit.target basic.target systemd-journald.socket system.slice dbus.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes",
  "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf",
  "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0",
  "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }",
  "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15",
  "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608",
  "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no",
  "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}}

TASK [fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] ****
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67
Friday 14 November 2025 08:53:40 -0500 (0:00:00.827) 0:00:21.302 *******
NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services for managed-node2
changed: [managed-node2] => {"changed": true, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "md5sum": "59e15d6f22a95d67b152af5a634072a8", "mode": "0400", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "src": "/root/.ansible/tmp/ansible-tmp-1763128420.8186328-8387-3185224540111/source", "state": "file", "uid": 0}

RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3
Friday 14 November 2025 08:53:41 -0500 (0:00:00.666) 0:00:21.969 *******
skipping: [managed-node2] => (item=realmd) => {"ansible_loop_var": "item", "changed": false, "item": "realmd", "skip_reason": "Conditional result was False"}
META: ran handlers

TASK [Add AD server to existing network connection for DNS] ********************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79
Friday 14 November 2025 08:53:41 -0500 (0:00:00.033) 0:00:22.002 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Manage timesync] *********************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93
Friday 14 November 2025 08:53:41 -0500 (0:00:00.017) 0:00:22.020 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Manage crypto policies] **************************************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102
Friday 14 November 2025 08:53:41 -0500 (0:00:00.014) 0:00:22.034 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Enable crypto policy allowing RC4 encryption] ****************************
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114
Friday 14 November 2025 08:53:41 -0500 (0:00:00.012) 0:00:22.047 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ******
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130
Friday 14 November 2025 08:53:41 -0500 (0:00:00.015) 0:00:22.063 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135
Friday 14 November 2025 08:53:41 -0500 (0:00:00.032) 0:00:22.095 *******
skipping: [managed-node2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142
Friday 14 November 2025 08:53:41 -0500 (0:00:00.014) 0:00:22.109 *******
skipping: [managed-node2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147
Friday 14 November 2025 08:53:41 -0500 (0:00:00.017) 0:00:22.127 *******
skipping: [managed-node2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153
Friday 14 November 2025 08:53:41 -0500 (0:00:00.014) 0:00:22.141 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161
Friday 14 November 2025 08:53:41 -0500 (0:00:00.013) 0:00:22.155 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174
Friday 14 November 2025 08:53:41 -0500 (0:00:00.014) 0:00:22.170 *******
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] ****
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179
Friday 14 November 2025 08:53:41 -0500 (0:00:00.014) 0:00:22.184 *******

TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191
Friday 14 November 2025 08:53:41 -0500 (0:00:00.015) 0:00:22.199 *******
skipping: [managed-node2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205
Friday 14 November 2025 08:53:41 -0500 (0:00:00.015) 0:00:22.214 *******
ok: [managed-node2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219
Friday 14 November 2025 08:53:41 -0500 (0:00:00.020) 0:00:22.234 *******
skipping: [managed-node2] => {}

TASK [fedora.linux_system_roles.ad_integration : Run realm join command] *******
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231
Friday 14 November 2025 08:53:41 -0500 (0:00:00.014) 0:00:22.249 *******
changed: [managed-node2] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true}

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
Friday 14 November 2025 08:53:42 -0500 (0:00:00.437) 0:00:22.687 *******
ok: [managed-node2] => {"changed": false, "stat": {"atime": 1763128422.0915127, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "049911c7517fba993eeb39dc494de8bf33faa685", "ctime": 1763128422.0905128, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 7074497, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1763128422.0905128, "nlink": 1, "path": "/etc/sssd/sssd.conf", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 87, "uid": 0, "version": "2036297279", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263
Friday 14 November 2025 08:53:42 -0500 (0:00:00.338) 0:00:23.025 *******
ok: [managed-node2] => {"changed": false, "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZAoK", "encoding": "base64", "source": "/etc/sssd/sssd.conf"}

TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269
Friday 14 November 2025 08:53:42 -0500 (0:00:00.470) 0:00:23.496 *******

TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ******
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284
Friday 14 November 2025 08:53:42 -0500 (0:00:00.022) 0:00:23.519 *******

TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] ***
task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Friday 14 November 2025 08:53:42 -0500 (0:00:00.021) 0:00:23.541 *******
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_update', 'value': 'True'}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "dyndns_update", "value": "True"}, "rc": 1}

MSG:
MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:
Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1763128423.0822506-8700-122057919944772/AnsiballZ_ini_file.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1763128423.0822506-8700-122057919944772/AnsiballZ_ini_file.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1763128423.0822506-8700-122057919944772/AnsiballZ_ini_file.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_vwdzkw_w/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:
Shared connection to 10.31.13.145 closed.

An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_ttl', 'value': '3600'}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "dyndns_ttl", "value": "3600"}, "rc": 1}

MSG:
MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:
Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1763128423.4935732-8700-84194011263495/AnsiballZ_ini_file.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1763128423.4935732-8700-84194011263495/AnsiballZ_ini_file.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1763128423.4935732-8700-84194011263495/AnsiballZ_ini_file.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_5k5o_exf/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:
Shared connection to 10.31.13.145 closed.

An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "dyndns_iface", "value": "TESTING"}, "rc": 1}

MSG:
MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:
Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1763128423.8182707-8700-81155193780810/AnsiballZ_ini_file.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1763128423.8182707-8700-81155193780810/AnsiballZ_ini_file.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1763128423.8182707-8700-81155193780810/AnsiballZ_ini_file.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py",
line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_ydqhuxtc/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_refresh_interval", "value": "86400" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128424.1470592-8700-42486695920279/AnsiballZ_ini_file.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128424.1470592-8700-42486695920279/AnsiballZ_ini_file.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128424.1470592-8700-42486695920279/AnsiballZ_ini_file.py", line 40, in invoke_module runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File 
"", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_cdhtmew6/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_update_ptr", "value": "True" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128424.49457-8700-227494237728137/AnsiballZ_ini_file.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128424.49457-8700-227494237728137/AnsiballZ_ini_file.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128424.49457-8700-227494237728137/AnsiballZ_ini_file.py", line 40, in invoke_module runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in 
_get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_40dsuulf/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_force_tcp", "value": "False" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128424.8600998-8700-162676588544237/AnsiballZ_ini_file.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128424.8600998-8700-162676588544237/AnsiballZ_ini_file.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128424.8600998-8700-162676588544237/AnsiballZ_ini_file.py", line 40, in invoke_module runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in 
spec_from_loader File "", line 544, in spec_from_file_location File "/tmp/ansible_community.general.ini_file_payload_ikkzl_aj/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128425.2043989-8700-71639236344164/AnsiballZ_ini_file.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128425.2043989-8700-71639236344164/AnsiballZ_ini_file.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128425.2043989-8700-71639236344164/AnsiballZ_ini_file.py", line 40, in invoke_module runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File 
"/tmp/ansible_community.general.ini_file_payload_eikki5w8/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "dyndns_server", "value": "127.0.0.1" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128425.5424995-8700-81002884879490/AnsiballZ_ini_file.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128425.5424995-8700-81002884879490/AnsiballZ_ini_file.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128425.5424995-8700-81002884879490/AnsiballZ_ini_file.py", line 40, in invoke_module runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File 
"/tmp/ansible_community.general.ini_file_payload_tfm6k5lp/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. An exception occurred during task execution. To see the full traceback, use -vvv. The error was: SyntaxError: future feature annotations is not defined failed: [managed-node2] (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => { "ansible_loop_var": "item", "changed": false, "item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "rc": 1 } MSG: MODULE FAILURE See stdout/stderr for the exact error MODULE_STDOUT: Traceback (most recent call last): File "/root/.ansible/tmp/ansible-tmp-1763128425.866098-8700-256045423223468/AnsiballZ_ini_file.py", line 102, in _ansiballz_main() File "/root/.ansible/tmp/ansible-tmp-1763128425.866098-8700-256045423223468/AnsiballZ_ini_file.py", line 94, in _ansiballz_main invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS) File "/root/.ansible/tmp/ansible-tmp-1763128425.866098-8700-256045423223468/AnsiballZ_ini_file.py", line 40, in invoke_module runpy.run_module(mod_name='ansible_collections.community.general.plugins.modules.ini_file', init_globals=None, run_name='__main__', alter_sys=True) File "/usr/lib64/python3.6/runpy.py", line 201, in run_module mod_name, mod_spec, code = _get_module_details(mod_name) File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details spec = importlib.util.find_spec(mod_name) File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec return _find_spec(fullname, parent.__path__) File "", line 894, in _find_spec File "", line 1157, in find_spec File "", line 1131, in _get_spec File "", line 1112, in _legacy_get_spec File "", line 441, in spec_from_loader File "", line 544, in spec_from_file_location File 
"/tmp/ansible_community.general.ini_file_payload_xmb0nubo/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10 SyntaxError: future feature annotations is not defined MODULE_STDERR: Shared connection to 10.31.13.145 closed. TASK [Cleanup fake realm] ****************************************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144 Friday 14 November 2025 08:53:46 -0500 (0:00:03.159) 0:00:26.700 ******* included: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2 TASK [Get role variables] ****************************************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6 Friday 14 November 2025 08:53:46 -0500 (0:00:00.033) 0:00:26.734 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create a temp file for fake realm cmd] *********************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Friday 14 November 2025 08:53:46 -0500 (0:00:00.014) 0:00:26.749 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set realm cmd variable for remainder of test] **************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 Friday 14 November 2025 08:53:46 -0500 (0:00:00.014) 0:00:26.764 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create fake realm cmd] *************************************************** task path: 
/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 Friday 14 November 2025 08:53:46 -0500 (0:00:00.015) 0:00:26.779 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if /etc/sssd exists] *********************************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 Friday 14 November 2025 08:53:46 -0500 (0:00:00.014) 0:00:26.793 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Install sssd-common for /etc/sssd] *************************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33 Friday 14 November 2025 08:53:46 -0500 (0:00:00.014) 0:00:26.807 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove realm cmd] ******************************************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 Friday 14 November 2025 08:53:46 -0500 (0:00:00.013) 0:00:26.821 ******* changed: [managed-node2] => { "changed": true, "path": "/tmp/lsr_a6djy2_9_ad_int_realm.py", "state": "absent" } TASK [Remove sssd-common] ****************************************************** task path: /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49 Friday 14 November 2025 08:53:46 -0500 (0:00:00.498) 0:00:27.319 ******* skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } PLAY RECAP ********************************************************************* managed-node2 : ok=18 changed=7 unreachable=0 failed=1 skipped=39 rescued=0 ignored=0 
SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155912+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_update", "value": "True" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155951+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_ttl", "value": "3600" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155962+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_iface", "value": "TESTING" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155971+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_refresh_interval", "value": "86400" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": 
"/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155979+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_update_ptr", "value": "True" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155987+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_force_tcp", "value": "False" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.155994+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_auth", "value": "GSS-TSIG" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.156001+00:00Z", "host": "managed-node2", "loop_item": { "key": "dyndns_server", "value": "127.0.0.1" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS 
updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" }, { "ansible_version": "2.9.27", "end_time": "2025-11-14T13:53:46.156009+00:00Z", "host": "managed-node2", "loop_item": { "key": "ad_hostname", "value": "managed-node2.dyndns-sample-realm.com" }, "loop_label": "", "loop_var": "item", "message": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1, "start_time": "2025-11-14T13:53:43.000397+00:00Z", "task_name": "Configure dynamic DNS updates", "task_path": "/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Friday 14 November 2025 08:53:46 -0500 (0:00:00.025) 0:00:27.344 ******* =============================================================================== fedora.linux_system_roles.ad_integration : Ensure required packages are installed -- 16.66s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52 fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 3.16s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298 Gathering Facts --------------------------------------------------------- 1.00s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20 fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.83s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60 Create fake realm cmd --------------------------------------------------- 0.74s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22 fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.67s 
/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67 Remove realm cmd -------------------------------------------------------- 0.50s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44 fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join --- 0.47s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263 fedora.linux_system_roles.ad_integration : Check if system is ostree ---- 0.46s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10 fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.44s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231 Create a temp file for fake realm cmd ----------------------------------- 0.42s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12 Check if /etc/sssd exists ----------------------------------------------- 0.34s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28 fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.34s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258 Test - Run the system role with bogus vars ------------------------------ 0.08s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39 fedora.linux_system_roles.ad_integration : Set platform/version specific variables --- 0.06s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 fedora.linux_system_roles.ad_integration : Set platform/version specific variables --- 0.05s 
/tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19 Get role variables ------------------------------------------------------ 0.05s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6 Setup fake realm -------------------------------------------------------- 0.04s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33 Set realm cmd variable for remainder of test ---------------------------- 0.04s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18 fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided --- 0.04s /tmp/collections-dKI/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40 -- Logs begin at Fri 2025-11-14 08:49:52 EST, end at Fri 2025-11-14 08:53:47 EST. -- Nov 14 08:53:19 managed-node2 sshd[7081]: Accepted publickey for root from 10.31.13.54 port 47502 ssh2: ECDSA SHA256:kznx7FjEdh4pEP6ygoS38nkINPwsfu3yq6zT9ROv7ag Nov 14 08:53:19 managed-node2 systemd[1]: Started Session 8 of user root. -- Subject: Unit session-8.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-8.scope has finished starting up. -- -- The start-up result is done. Nov 14 08:53:19 managed-node2 systemd-logind[598]: New session 8 of user root. -- Subject: A new session 8 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 8 has been created for the user root. -- -- The leading process of the session is 7081. 
Nov 14 08:53:19 managed-node2 sshd[7081]: pam_unix(sshd:session): session opened for user root by (uid=0)
Nov 14 08:53:20 managed-node2 platform-python[7226]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d
Nov 14 08:53:20 managed-node2 platform-python[7374]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 14 08:53:21 managed-node2 platform-python[7497]: ansible-tempfile Invoked with prefix=lsr_ suffix=_ad_int_realm.py state=file path=None
Nov 14 08:53:21 managed-node2 platform-python[7620]: ansible-stat Invoked with path=/tmp/lsr_a6djy2_9_ad_int_realm.py follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 14 08:53:22 managed-node2 platform-python[7721]: ansible-copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1763128401.674985-7922-209647188625174/source dest=/tmp/lsr_a6djy2_9_ad_int_realm.py mode=0755 follow=False _original_basename=fake_realm.py.j2 checksum=30318e4f54519605d60caa5bc62e429287b28973 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Nov 14 08:53:22 managed-node2 platform-python[7846]: ansible-stat Invoked with path=/etc/sssd follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 14 08:53:23 managed-node2 platform-python[7971]: ansible-dnf Invoked with name=['realmd', 'PackageKit'] state=present allow_downgrade=False autoremove=False bugfix=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True lock_timeout=30 conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 14 08:53:38 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:38 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:38 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:38 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:38 managed-node2 systemd[1]: Reloading.
Nov 14 08:53:38 managed-node2 polkitd[939]: Reloading rules
Nov 14 08:53:38 managed-node2 polkitd[939]: Collecting garbage unconditionally...
Nov 14 08:53:38 managed-node2 polkitd[939]: Loading rules from directory /etc/polkit-1/rules.d
Nov 14 08:53:38 managed-node2 polkitd[939]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 14 08:53:38 managed-node2 polkitd[939]: Finished loading, compiling and executing 3 rules
Nov 14 08:53:38 managed-node2 polkitd[939]: Reloading rules
Nov 14 08:53:38 managed-node2 polkitd[939]: Collecting garbage unconditionally...
Nov 14 08:53:38 managed-node2 polkitd[939]: Loading rules from directory /etc/polkit-1/rules.d
Nov 14 08:53:38 managed-node2 polkitd[939]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 14 08:53:38 managed-node2 polkitd[939]: Finished loading, compiling and executing 3 rules
Nov 14 08:53:39 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:39 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:39 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:39 managed-node2 dbus-daemon[587]: [system] Reloaded configuration
Nov 14 08:53:39 managed-node2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
-- Subject: Unit run-r58da512c1ee040089919c14d9dc311a6.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit run-r58da512c1ee040089919c14d9dc311a6.service has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:39 managed-node2 systemd[1]: Reloading.
Nov 14 08:53:39 managed-node2 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Nov 14 08:53:39 managed-node2 systemd[1]: Starting man-db-cache-update.service...
-- Subject: Unit man-db-cache-update.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has begun starting up.
Nov 14 08:53:40 managed-node2 systemd[1]: man-db-cache-update.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit man-db-cache-update.service has successfully entered the 'dead' state.
Nov 14 08:53:40 managed-node2 systemd[1]: Started man-db-cache-update.service.
-- Subject: Unit man-db-cache-update.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:40 managed-node2 systemd[1]: run-r58da512c1ee040089919c14d9dc311a6.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-r58da512c1ee040089919c14d9dc311a6.service has successfully entered the 'dead' state.
Nov 14 08:53:40 managed-node2 platform-python[8581]: ansible-systemd Invoked with name=realmd state=started enabled=True daemon_reload=False daemon_reexec=False no_block=False force=None masked=None user=None scope=None
Nov 14 08:53:40 managed-node2 systemd[1]: Starting Realm and Domain Configuration...
-- Subject: Unit realmd.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit realmd.service has begun starting up.
Nov 14 08:53:40 managed-node2 realmd[8589]: Loaded settings from: /usr/lib/realmd/realmd-defaults.conf /usr/lib/realmd/realmd-distro.conf
Nov 14 08:53:40 managed-node2 realmd[8589]: holding daemon: startup
Nov 14 08:53:40 managed-node2 realmd[8589]: starting service
Nov 14 08:53:40 managed-node2 realmd[8589]: connected to bus
Nov 14 08:53:40 managed-node2 realmd[8589]: released daemon: startup
Nov 14 08:53:40 managed-node2 realmd[8589]: claimed name on bus: org.freedesktop.realmd
Nov 14 08:53:40 managed-node2 systemd[1]: Started Realm and Domain Configuration.
-- Subject: Unit realmd.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit realmd.service has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:41 managed-node2 platform-python[8714]: ansible-stat Invoked with path=/etc/realmd.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 14 08:53:41 managed-node2 platform-python[8813]: ansible-copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1763128420.8186328-8387-3185224540111/source dest=/etc/realmd.conf backup=True mode=0400 follow=False _original_basename=realmd.conf.j2 checksum=7e0c9eddf5cee60f782f39e0f445b043ab4bcb61 force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Nov 14 08:53:42 managed-node2 platform-python[9062]: ansible-stat Invoked with path=/etc/sssd/sssd.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 14 08:53:42 managed-node2 platform-python[9187]: ansible-slurp Invoked with path=/etc/sssd/sssd.conf src=/etc/sssd/sssd.conf
Nov 14 08:53:46 managed-node2 platform-python[10417]: ansible-file Invoked with path=/tmp/lsr_a6djy2_9_ad_int_realm.py state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Nov 14 08:53:46 managed-node2 sshd[10438]: Accepted publickey for root from 10.31.13.54 port 47894 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 14 08:53:46 managed-node2 systemd-logind[598]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 10438.
Nov 14 08:53:46 managed-node2 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:46 managed-node2 sshd[10438]: pam_unix(sshd:session): session opened for user root by (uid=0)
Nov 14 08:53:47 managed-node2 sshd[10441]: Received disconnect from 10.31.13.54 port 47894:11: disconnected by user
Nov 14 08:53:47 managed-node2 sshd[10441]: Disconnected from user root 10.31.13.54 port 47894
Nov 14 08:53:47 managed-node2 sshd[10438]: pam_unix(sshd:session): session closed for user root
Nov 14 08:53:47 managed-node2 systemd[1]: session-9.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-9.scope has successfully entered the 'dead' state.
Nov 14 08:53:47 managed-node2 systemd-logind[598]: Session 9 logged out. Waiting for processes to exit.
Nov 14 08:53:47 managed-node2 systemd-logind[598]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Nov 14 08:53:47 managed-node2 sshd[10462]: Accepted publickey for root from 10.31.13.54 port 47908 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 14 08:53:47 managed-node2 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Nov 14 08:53:47 managed-node2 systemd-logind[598]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 10462.
Nov 14 08:53:47 managed-node2 sshd[10462]: pam_unix(sshd:session): session opened for user root by (uid=0)