[WARNING]: Collection community.general does not support Ansible version 2.16.14
[WARNING]: Could not match supplied host pattern, ignoring: ad
ansible-playbook [core 2.16.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-mDb
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
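The first warning above means ansible-core compared its own version against the `requires_ansible` range that the installed community.general build declares in its `meta/runtime.yml`, and found 2.16.14 outside that range. A minimal sketch of that check, with a hypothetical range and a temp file standing in for the collection's real `meta/runtime.yml` (neither is taken from this run):

```python
import re
import tempfile
from pathlib import Path

# Hypothetical meta/runtime.yml; the version range below is illustrative,
# not community.general's actual declaration.
meta = Path(tempfile.mkdtemp()) / "runtime.yml"
meta.write_text("requires_ansible: '>=2.17.0'\n")

declared = re.search(r"requires_ansible:\s*'([^']*)'", meta.read_text()).group(1)

# ansible-core does a full version-specifier match; comparing against the
# lower bound is enough to show why 2.16.14 would trip the warning here.
core = (2, 16, 14)
lower = tuple(int(p) for p in declared.lstrip(">=").split("."))
print(core >= lower)  # → False: this ansible-core is below the declared minimum
```

The warning is advisory; the play still runs, which is why the failure only surfaces later when a module from the collection actually executes.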
PLAYBOOK: tests_dyndns.yml *****************************************************
1 plays in /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml

PLAY [Ensure that the role configures dynamic dns] *****************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
Saturday 08 November 2025 07:21:09 -0500 (0:00:00.028) 0:00:00.028 *****
ok: [managed-node2]

TASK [Setup fake realm] ********************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33
Saturday 08 November 2025 07:21:10 -0500 (0:00:01.187) 0:00:01.215 *****
included: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2

TASK [Get role variables] ******************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6
Saturday 08 November 2025 07:21:10 -0500 (0:00:00.044) 0:00:01.260 *****

TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.028) 0:00:01.288 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] ****
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.032) 0:00:01.321 *****
ok: [managed-node2] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.414) 0:00:01.735 *****
ok: [managed-node2] => { "ansible_facts": { "__ad_integration_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.022) 0:00:01.757 *****
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => { "changed": false }
MSG: All items skipped

TASK [Create a temp file for fake realm cmd] ***********************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.037) 0:00:01.794 *****
changed: [managed-node2] => { "changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_p_0e6cet_ad_int_realm.py", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Set realm cmd variable for remainder of test] ****************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.411) 0:00:02.206 *****
ok: [managed-node2] => { "ansible_facts": { "__ad_integration_realm_cmd": "/tmp/lsr_p_0e6cet_ad_int_realm.py" }, "changed": false }

TASK [Create fake realm cmd] ***************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22
Saturday 08 November 2025 07:21:11 -0500 (0:00:00.017) 0:00:02.224 *****
changed: [managed-node2] => { "changed": true, "checksum": "30318e4f54519605d60caa5bc62e429287b28973", "dest": "/tmp/lsr_p_0e6cet_ad_int_realm.py", "gid": 0, "group": "root", "md5sum": "3ea3ed87c4442dcbe51dfff237c430ed", "mode": "0755", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 1867, "src": "/root/.ansible/tmp/ansible-tmp-1762604471.999868-8063-4204816513503/source", "state": "file", "uid": 0 }

TASK [Check if /etc/sssd exists] ***********************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28
Saturday 08 November 2025 07:21:12 -0500 (0:00:00.758) 0:00:02.983 *****
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1716968740.483, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1716968740.245, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 993, "gr_name": "sssd", "inode": 7060576, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0700", "mtime": 1716968740.245, "nlink": 4, "path": "/etc/sssd", "pw_name": "sssd", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 31, "uid": 996, "version": "3583498373", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": true } }

TASK [Install sssd-common for /etc/sssd] ***************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.330) 0:00:03.314 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "not __sssd_dir_stat.stat.exists", "skip_reason": "Conditional result was False" }

TASK [Remove realm cmd] ********************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.014) 0:00:03.328 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" }

TASK [Remove sssd-common] ******************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.012) 0:00:03.341 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "__manage_fake_realm == \"cleanup\"", "skip_reason": "Conditional result was False" }

TASK [Test - Run the system role with bogus vars] ******************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.011) 0:00:03.352 *****

TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.056) 0:00:03.408 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "not ad_integration_realm", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Assume managing timesync if timesource is set] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:8
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.012) 0:00:03.421 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_timesync_source is not none", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Ensure time source is provided if managing timesync] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:15
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.016) 0:00:03.437 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Assume managing crypto policies if allow_rc4_crypto is set] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:25
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.029) 0:00:03.467 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Ensure manage_crypt_policies is set with crypto_allow_rc4] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:30
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.029) 0:00:03.496 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Ensure all required dns variables are provided] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:40
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.030) 0:00:03.527 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:49
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.028) 0:00:03.555 *****
included: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.ad_integration : Ensure ansible_facts used by role] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:2
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.020) 0:00:03.576 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "__ad_integration_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Check if system is ostree] ****
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.031) 0:00:03.608 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Set flag to indicate system is ostree] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:15
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.024) 0:00:03.632 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "not __ad_integration_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Set platform/version specific variables] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:19
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.015) 0:00:03.648 *****
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS_8.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_8.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => { "changed": false }
MSG: All items skipped

TASK [fedora.linux_system_roles.ad_integration : Ensure required packages are installed] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52
Saturday 08 November 2025 07:21:13 -0500 (0:00:00.030) 0:00:03.678 *****
changed: [managed-node2] => { "changed": true, "rc": 0, "results": [
  "Installed: dejavu-sans-mono-fonts-2.35-7.el8.noarch",
  "Installed: gsettings-desktop-schemas-3.32.0-6.el8.x86_64",
  "Installed: realmd-0.17.1-2.el8.x86_64",
  "Installed: PackageKit-glib-1.1.12-7.el8.x86_64",
  "Installed: abattis-cantarell-fonts-0.0.25-6.el8.noarch",
  "Installed: libproxy-0.4.15-5.2.el8.x86_64",
  "Installed: gdk-pixbuf2-2.36.12-5.el8.x86_64",
  "Installed: libstemmer-0-10.585svn.el8.x86_64",
  "Installed: glib-networking-2.56.1-1.1.el8.x86_64",
  "Installed: libmodman-2.0.1-17.el8.x86_64",
  "Installed: json-glib-1.4.4-1.el8.x86_64",
  "Installed: libappstream-glib-0.7.14-3.el8.x86_64",
  "Installed: fontpackages-filesystem-1.44-22.el8.noarch",
  "Installed: libsoup-2.62.3-5.el8.x86_64",
  "Installed: PackageKit-1.1.12-7.el8.x86_64",
  "Installed: dejavu-fonts-common-2.35-7.el8.noarch"
] }
lsrpackages: PackageKit realmd

TASK [fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60
Saturday 08 November 2025 07:21:30 -0500 (0:00:16.730) 0:00:20.409 *****
changed: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "realmd", "name": "realmd", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus.socket system.slice systemd-journald.socket sysinit.target basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.freedesktop.realmd", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec":
"infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Realm and Domain Configuration", "DevicePolicy": "auto", "Documentation": "man:realm(8) man:realmd.conf(5)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/realmd ; argv[]=/usr/libexec/realmd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/realmd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", 
"IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "realmd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "realmd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", 
"NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket sysinit.target system.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "static", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK 
[fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf] ****
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67
Saturday 08 November 2025 07:21:30 -0500 (0:00:00.821) 0:00:21.230 *****
Notification for handler Handler for ad_integration to restart services has been saved.
changed: [managed-node2] => { "changed": true, "checksum": "7e0c9eddf5cee60f782f39e0f445b043ab4bcb61", "dest": "/etc/realmd.conf", "gid": 0, "group": "root", "md5sum": "59e15d6f22a95d67b152af5a634072a8", "mode": "0400", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 181, "src": "/root/.ansible/tmp/ansible-tmp-1762604491.0090132-8332-94561797436793/source", "state": "file", "uid": 0 }

TASK [fedora.linux_system_roles.ad_integration : Flush handlers] ***************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:75
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.718) 0:00:21.948 *****
NOTIFIED HANDLER fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services for managed-node2
META: triggered running handlers for managed-node2

RUNNING HANDLER [fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.004) 0:00:21.952 *****
skipping: [managed-node2] => (item=realmd) => { "ansible_loop_var": "item", "changed": false, "false_condition": "not __ad_integration_test_sssd_config_only | default(false)", "item": "realmd", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => { "changed": false }
MSG: All items skipped

TASK [Add AD server to existing network connection for DNS] ********************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:79
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.042) 0:00:21.995 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_dns | bool", "skip_reason": "Conditional result was False" }

TASK [Manage timesync] *********************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:93
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.034) 0:00:22.030 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_timesync | bool", "skip_reason": "Conditional result was False" }

TASK [Manage crypto policies] **************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:102
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.030) 0:00:22.060 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_manage_crypto_policies | bool", "skip_reason": "Conditional result was False" }

TASK [Enable crypto policy allowing RC4 encryption] ****************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:114
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.029) 0:00:22.090 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_allow_rc4_crypto | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists] ******
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:130
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.031) 0:00:22.121 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_sssd_merge_duplicate_sections | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:135
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.030) 0:00:22.151 *****
skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 1] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:142
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.030) 0:00:22.182 *****
skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.ad_integration : Set variables we will need for merging - 2] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:147
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.032) 0:00:22.214 *****
skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153
Saturday 08 November 2025 07:21:31 -0500 (0:00:00.035) 0:00:22.250 *****
skipping: [managed-node2] => { "changed": false, "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.ad_integration : Leave existing joined domain] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.051)       0:00:22.301 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ad_integration_force_rejoin | bool or __ad_integration_has_duplicates | d(false)",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.046)       0:00:22.348 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__ad_integration_has_duplicates | d(false)",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.ad_integration : Remove duplicate sections] ****
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:179
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.039)       0:00:22.388 *****
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.ad_integration : Build Command - Join to a specific Domain Controller] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:191
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.031)       0:00:22.420 *****
skipping: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}

TASK [fedora.linux_system_roles.ad_integration : Build Join Command - Perform discovery-based realm join operation] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:205
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.015)       0:00:22.436 *****
ok: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}

TASK [fedora.linux_system_roles.ad_integration : Show the join command for debug] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:219
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.036)       0:00:22.472 *****
skipping: [managed-node2] => {
    "false_condition": "ad_integration_join_to_dc == __ad_integration_sample_dc or ad_integration_realm == __ad_integration_sample_realm or ansible_check_mode"
}

TASK [fedora.linux_system_roles.ad_integration : Run realm join command] *******
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.012)       0:00:22.484 *****
changed: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": true
}

TASK [fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
Saturday 08 November 2025  07:21:32 -0500 (0:00:00.468)       0:00:22.953 *****
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "atime": 1762604492.633212,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "049911c7517fba993eeb39dc494de8bf33faa685",
        "ctime": 1762604492.632212,
        "dev": 51713,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 7074497,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0644",
        "mtime": 1762604492.632212,
        "nlink": 1,
        "path": "/etc/sssd/sssd.conf",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 87,
        "uid": 0,
        "version": "2036297279",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263
Saturday 08 November 2025  07:21:33 -0500 (0:00:00.393)       0:00:23.346 *****
ok: [managed-node2] => {
    "changed": false,
    "content": "W2RvbWFpbi9keW5kbnMtc2FtcGxlLXJlYWxtLmNvbV0KYWRfZG9tYWluID0gZHluZG5zLXNhbXBsZS1yZWFsbS5jb20KaWRfcHJvdmlkZXIgPSBhZAoK",
    "encoding": "base64",
    "source": "/etc/sssd/sssd.conf"
}

TASK [fedora.linux_system_roles.ad_integration : Consolidate options from duplicate sections] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:269
Saturday 08 November 2025  07:21:33 -0500 (0:00:00.502)       0:00:23.849 *****
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.ad_integration : Configure SSSD settings] ******
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:284
Saturday 08 November 2025  07:21:33 -0500 (0:00:00.035)       0:00:23.884 *****
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates] ***
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Saturday 08 November 2025  07:21:33 -0500 (0:00:00.010)       0:00:23.895 *****
An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_update', 'value': 'True'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_update",
        "value": "True"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604493.7235315-8663-35844824359272/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604493.7235315-8663-35844824359272/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604493.7235315-8663-35844824359272/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_ybxa43__/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_ttl', 'value': '3600'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_ttl",
        "value": "3600"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604494.1459117-8663-81009301563327/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604494.1459117-8663-81009301563327/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604494.1459117-8663-81009301563327/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_60ts2uxv/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_iface', 'value': 'TESTING'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_iface",
        "value": "TESTING"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604494.4870727-8663-47428596993773/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604494.4870727-8663-47428596993773/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604494.4870727-8663-47428596993773/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_2k93sbye/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_refresh_interval', 'value': '86400'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_refresh_interval",
        "value": "86400"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604494.8358583-8663-180710489763131/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604494.8358583-8663-180710489763131/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604494.8358583-8663-180710489763131/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_cxrmmyjw/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_update_ptr', 'value': 'True'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_update_ptr",
        "value": "True"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604495.1947737-8663-65707285433730/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604495.1947737-8663-65707285433730/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604495.1947737-8663-65707285433730/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_3wpweg9l/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_force_tcp', 'value': 'False'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_force_tcp",
        "value": "False"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604495.5236177-8663-79440805320974/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604495.5236177-8663-79440805320974/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604495.5236177-8663-79440805320974/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_xvhyam4u/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_auth', 'value': 'GSS-TSIG'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_auth",
        "value": "GSS-TSIG"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604495.8539987-8663-266341615149570/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604495.8539987-8663-266341615149570/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604495.8539987-8663-266341615149570/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_pgvelikd/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'dyndns_server', 'value': '127.0.0.1'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "dyndns_server",
        "value": "127.0.0.1"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604496.181563-8663-37180447222746/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604496.181563-8663-37180447222746/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604496.181563-8663-37180447222746/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_vbc7efct/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.

An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: SyntaxError: future feature annotations is not defined
failed: [managed-node2] (item={'key': 'ad_hostname', 'value': 'managed-node2.dyndns-sample-realm.com'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "key": "ad_hostname",
        "value": "managed-node2.dyndns-sample-realm.com"
    },
    "rc": 1
}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1762604496.5262618-8663-102737692282804/AnsiballZ_ini_file.py", line 107, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1762604496.5262618-8663-102737692282804/AnsiballZ_ini_file.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1762604496.5262618-8663-102737692282804/AnsiballZ_ini_file.py", line 48, in invoke_module
    run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 201, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib64/python3.6/runpy.py", line 128, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib64/python3.6/importlib/util.py", line 89, in find_spec
    return _find_spec(fullname, parent.__path__)
  File "", line 894, in _find_spec
  File "", line 1157, in find_spec
  File "", line 1131, in _get_spec
  File "", line 1112, in _legacy_get_spec
  File "", line 441, in spec_from_loader
  File "", line 544, in spec_from_file_location
  File "/tmp/ansible_community.general.ini_file_payload_xjp7510t/ansible_community.general.ini_file_payload.zip/ansible_collections/community/general/plugins/modules/ini_file.py", line 10
SyntaxError: future feature annotations is not defined

MODULE_STDERR:

Shared connection to 10.31.42.234 closed.
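[Editorial aside, not part of the captured log.] All nine loop items fail on the same line: `ini_file.py` in community.general opens with `from __future__ import annotations`, and the managed node executes modules with `/usr/lib64/python3.6`, where that future feature does not exist yet (consistent with the warning at the top that this community.general release does not support the running Ansible stack). A minimal sketch, run on any modern interpreter, confirming from stdlib metadata that the `annotations` feature first appears in Python 3.7:

```python
# Python's __future__ module records, for each feature, the release
# in which it became available. "annotations" (PEP 563) is optional
# from 3.7 onward, so a 3.6 compiler rejects the import with
# "SyntaxError: future feature annotations is not defined" --
# exactly the error in the tracebacks above.
import __future__

release = __future__.annotations.getOptionalRelease()
print(release[:2])  # first release offering the feature: Python 3.7
```

The practical fix is outside this snippet: either use a community.general version that still supports Python 3.6 module execution on the target, or upgrade the target's module interpreter.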
TASK [Cleanup fake realm] ******************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144
Saturday 08 November 2025  07:21:36 -0500 (0:00:03.188)       0:00:27.083 *****
included: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml for managed-node2

TASK [Get role variables] ******************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:6
Saturday 08 November 2025  07:21:36 -0500 (0:00:00.082)       0:00:27.166 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__manage_fake_realm == \"setup\"",
    "skip_reason": "Conditional result was False"
}

TASK [Create a temp file for fake realm cmd] ***********************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12
Saturday 08 November 2025  07:21:36 -0500 (0:00:00.018)       0:00:27.184 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__manage_fake_realm == \"setup\"",
    "skip_reason": "Conditional result was False"
}

TASK [Set realm cmd variable for remainder of test] ****************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:18
Saturday 08 November 2025  07:21:36 -0500 (0:00:00.019)       0:00:27.203 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__manage_fake_realm == \"setup\"",
    "skip_reason": "Conditional result was False"
}

TASK [Create fake realm cmd] ***************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22
Saturday 08 November 2025  07:21:36 -0500 (0:00:00.017)       0:00:27.221 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__manage_fake_realm == \"setup\"",
    "skip_reason": "Conditional result was False"
}

TASK [Check if /etc/sssd exists] ***********************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28
Saturday 08 November 2025  07:21:36 -0500 (0:00:00.019)       0:00:27.240 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__manage_fake_realm == \"setup\"",
    "skip_reason": "Conditional result was False"
}

TASK [Install sssd-common for /etc/sssd] ***************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:33
Saturday 08 November 2025  07:21:36 -0500 (0:00:00.017)       0:00:27.257 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__manage_fake_realm == \"setup\"",
    "skip_reason": "Conditional result was False"
}

TASK [Remove realm cmd] ********************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44
Saturday 08 November 2025  07:21:37 -0500 (0:00:00.018)       0:00:27.276 *****
changed: [managed-node2] => {
    "changed": true,
    "path": "/tmp/lsr_p_0e6cet_ad_int_realm.py",
    "state": "absent"
}

TASK [Remove sssd-common] ******************************************************
task path: /tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:49
Saturday 08 November 2025  07:21:37 -0500 (0:00:00.511)       0:00:27.787 *****
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__installed_sssd_package is changed",
    "skip_reason": "Conditional result was False"
}

PLAY RECAP *********************************************************************
managed-node2              : ok=18   changed=7    unreachable=0    failed=1    skipped=39   rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816888+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_update",
            "value": "True"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816929+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_ttl",
            "value": "3600"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816942+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_iface",
            "value": "TESTING"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816951+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_refresh_interval",
            "value": "86400"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816960+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_update_ptr",
            "value": "True"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816968+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_force_tcp",
            "value": "False"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816978+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_auth",
            "value": "GSS-TSIG"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816987+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "dyndns_server",
            "value": "127.0.0.1"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    },
    {
        "ansible_version": "2.16.14",
        "end_time": "2025-11-08T12:21:36.816995+00:00Z",
        "host": "managed-node2",
        "loop_item": {
            "key": "ad_hostname",
            "value": "managed-node2.dyndns-sample-realm.com"
        },
        "loop_label": "",
        "loop_var": "item",
        "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
        "rc": 1,
        "start_time": "2025-11-08T12:21:33.634611+00:00Z",
        "task_name": "Configure dynamic DNS updates",
        "task_path": "/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298"
    }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Saturday 08 November 2025  07:21:37 -0500 (0:00:00.016)       0:00:27.803 *****
===============================================================================
fedora.linux_system_roles.ad_integration : Ensure required packages are installed -- 16.73s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:52
fedora.linux_system_roles.ad_integration : Configure dynamic DNS updates --- 3.19s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:298
Gathering Facts --------------------------------------------------------- 1.19s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:20
fedora.linux_system_roles.ad_integration : Ensure required services are enabled and started --- 0.82s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:60
Create fake realm cmd --------------------------------------------------- 0.76s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:22
fedora.linux_system_roles.ad_integration : Generate /etc/realmd.conf ---- 0.72s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:67
Remove realm cmd -------------------------------------------------------- 0.51s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:44
fedora.linux_system_roles.ad_integration : Grab sssd.conf if it exists after realm join --- 0.50s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:263
fedora.linux_system_roles.ad_integration : Run realm join command ------- 0.47s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:231
fedora.linux_system_roles.ad_integration : Check if system is ostree ---- 0.41s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/set_vars.yml:10
Create a temp file for fake realm cmd ----------------------------------- 0.41s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:12
fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm join --- 0.39s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:258
Check if /etc/sssd exists ----------------------------------------------- 0.33s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tasks/manage_fake_realm.yml:28
Cleanup fake realm ------------------------------------------------------ 0.08s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:144
Test - Run the system role with bogus vars ------------------------------ 0.06s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:39
fedora.linux_system_roles.ad_integration : Check if we are already joined to a domain --- 0.05s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:153
fedora.linux_system_roles.ad_integration : Leave existing joined domain --- 0.05s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:161
Setup fake realm -------------------------------------------------------- 0.04s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_dyndns.yml:33
fedora.linux_system_roles.ad_integration : Handler for ad_integration to restart services --- 0.04s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml:3
fedora.linux_system_roles.ad_integration : See if sssd.conf exists after realm leave --- 0.04s
/tmp/collections-mDb/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:174

-- Logs begin at Sat 2025-11-08 07:18:05 EST, end at Sat 2025-11-08 07:21:37 EST. --
Nov 08 07:21:09 managed-node2 sshd[7091]: Accepted publickey for root from 10.31.14.188 port 39706 ssh2: ECDSA SHA256:OWWvhoBihqepl5iqhwR/qLdlRT5CbzBajKqOWqo63NE
Nov 08 07:21:09 managed-node2 systemd[1]: Started Session 8 of user root.
-- Subject: Unit session-8.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-8.scope has finished starting up.
--
-- The start-up result is done.
Nov 08 07:21:09 managed-node2 systemd-logind[610]: New session 8 of user root.
-- Subject: A new session 8 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 8 has been created for the user root.
--
-- The leading process of the session is 7091.
Nov 08 07:21:09 managed-node2 sshd[7091]: pam_unix(sshd:session): session opened for user root by (uid=0)
Nov 08 07:21:10 managed-node2 platform-python[7236]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 08 07:21:11 managed-node2 platform-python[7388]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 08 07:21:11 managed-node2 platform-python[7511]: ansible-tempfile Invoked with prefix=lsr_ suffix=_ad_int_realm.py state=file path=None
Nov 08 07:21:12 managed-node2 platform-python[7634]: ansible-ansible.legacy.stat Invoked with path=/tmp/lsr_p_0e6cet_ad_int_realm.py follow=False get_checksum=True checksum_algorithm=sha1 get_mime=True get_attributes=True
Nov 08 07:21:12 managed-node2 platform-python[7735]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1762604471.999868-8063-4204816513503/source dest=/tmp/lsr_p_0e6cet_ad_int_realm.py mode=0755 follow=False _original_basename=fake_realm.py.j2 checksum=30318e4f54519605d60caa5bc62e429287b28973 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 08 07:21:13 managed-node2 platform-python[7860]: ansible-stat Invoked with path=/etc/sssd follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 08 07:21:13 managed-node2 platform-python[7985]: ansible-ansible.legacy.dnf Invoked with name=['realmd', 'PackageKit'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 08 07:21:28 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:28 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:28 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:28 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:29 managed-node2 systemd[1]: Reloading.
Nov 08 07:21:29 managed-node2 polkitd[945]: Reloading rules
Nov 08 07:21:29 managed-node2 polkitd[945]: Collecting garbage unconditionally...
Nov 08 07:21:29 managed-node2 polkitd[945]: Loading rules from directory /etc/polkit-1/rules.d
Nov 08 07:21:29 managed-node2 polkitd[945]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 08 07:21:29 managed-node2 polkitd[945]: Finished loading, compiling and executing 3 rules
Nov 08 07:21:29 managed-node2 polkitd[945]: Reloading rules
Nov 08 07:21:29 managed-node2 polkitd[945]: Collecting garbage unconditionally...
Nov 08 07:21:29 managed-node2 polkitd[945]: Loading rules from directory /etc/polkit-1/rules.d
Nov 08 07:21:29 managed-node2 polkitd[945]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 08 07:21:29 managed-node2 polkitd[945]: Finished loading, compiling and executing 3 rules
Nov 08 07:21:29 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:29 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:29 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:29 managed-node2 dbus-daemon[597]: [system] Reloaded configuration
Nov 08 07:21:29 managed-node2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
-- Subject: Unit run-r2bbaecba065e4cfd85dc07fc1165ccaf.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit run-r2bbaecba065e4cfd85dc07fc1165ccaf.service has finished starting up.
--
-- The start-up result is done.
Nov 08 07:21:29 managed-node2 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Nov 08 07:21:29 managed-node2 systemd[1]: Starting man-db-cache-update.service...
-- Subject: Unit man-db-cache-update.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has begun starting up.
Nov 08 07:21:29 managed-node2 systemd[1]: Reloading.
Nov 08 07:21:30 managed-node2 systemd[1]: man-db-cache-update.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit man-db-cache-update.service has successfully entered the 'dead' state.
Nov 08 07:21:30 managed-node2 systemd[1]: Started man-db-cache-update.service.
-- Subject: Unit man-db-cache-update.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has finished starting up.
--
-- The start-up result is done.
Nov 08 07:21:30 managed-node2 systemd[1]: run-r2bbaecba065e4cfd85dc07fc1165ccaf.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-r2bbaecba065e4cfd85dc07fc1165ccaf.service has successfully entered the 'dead' state.
Nov 08 07:21:30 managed-node2 platform-python[8597]: ansible-ansible.legacy.systemd Invoked with name=realmd state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 08 07:21:30 managed-node2 systemd[1]: Starting Realm and Domain Configuration...
-- Subject: Unit realmd.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit realmd.service has begun starting up.
Nov 08 07:21:30 managed-node2 realmd[8605]: Loaded settings from: /usr/lib/realmd/realmd-defaults.conf /usr/lib/realmd/realmd-distro.conf
Nov 08 07:21:30 managed-node2 realmd[8605]: holding daemon: startup
Nov 08 07:21:30 managed-node2 realmd[8605]: starting service
Nov 08 07:21:30 managed-node2 realmd[8605]: connected to bus
Nov 08 07:21:30 managed-node2 realmd[8605]: released daemon: startup
Nov 08 07:21:30 managed-node2 systemd[1]: Started Realm and Domain Configuration.
-- Subject: Unit realmd.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit realmd.service has finished starting up.
--
-- The start-up result is done.
Nov 08 07:21:30 managed-node2 realmd[8605]: claimed name on bus: org.freedesktop.realmd
Nov 08 07:21:31 managed-node2 platform-python[8730]: ansible-ansible.legacy.stat Invoked with path=/etc/realmd.conf follow=False get_checksum=True checksum_algorithm=sha1 get_mime=True get_attributes=True
Nov 08 07:21:31 managed-node2 platform-python[8829]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1762604491.0090132-8332-94561797436793/source dest=/etc/realmd.conf backup=True mode=0400 follow=False _original_basename=realmd.conf.j2 checksum=7e0c9eddf5cee60f782f39e0f445b043ab4bcb61 force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 08 07:21:33 managed-node2 platform-python[9078]: ansible-stat Invoked with path=/etc/sssd/sssd.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 08 07:21:33 managed-node2 platform-python[9203]: ansible-slurp Invoked with path=/etc/sssd/sssd.conf src=/etc/sssd/sssd.conf
Nov 08 07:21:37 managed-node2 platform-python[10433]: ansible-file Invoked with path=/tmp/lsr_p_0e6cet_ad_int_realm.py state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 08 07:21:37 managed-node2 sshd[10454]: Accepted publickey for root from 10.31.14.188 port 45472 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 08 07:21:37 managed-node2 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Nov 08 07:21:37 managed-node2 systemd-logind[610]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 10454.
Nov 08 07:21:37 managed-node2 sshd[10454]: pam_unix(sshd:session): session opened for user root by (uid=0)
Nov 08 07:21:37 managed-node2 sshd[10457]: Received disconnect from 10.31.14.188 port 45472:11: disconnected by user
Nov 08 07:21:37 managed-node2 sshd[10457]: Disconnected from user root 10.31.14.188 port 45472
Nov 08 07:21:37 managed-node2 sshd[10454]: pam_unix(sshd:session): session closed for user root
Nov 08 07:21:37 managed-node2 systemd[1]: session-9.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-9.scope has successfully entered the 'dead' state.
Nov 08 07:21:37 managed-node2 systemd-logind[610]: Session 9 logged out. Waiting for processes to exit.
Nov 08 07:21:37 managed-node2 systemd-logind[610]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Nov 08 07:21:37 managed-node2 sshd[10478]: Accepted publickey for root from 10.31.14.188 port 45478 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 08 07:21:37 managed-node2 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Nov 08 07:21:37 managed-node2 systemd-logind[610]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 10478.
Nov 08 07:21:37 managed-node2 sshd[10478]: pam_unix(sshd:session): session opened for user root by (uid=0)