ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/restore_services_state.yml
statically imported: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/get_services_state.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_verify_from_elasticsearch.yml **********************************
2 plays in /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/tests_verify_from_elasticsearch.yml

PLAY [all] *********************************************************************
META: ran handlers

TASK [Include vault variables] *************************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/tests_verify_from_elasticsearch.yml:5
Saturday 17 August 2024  06:22:55 -0400 (0:00:00.143)       0:00:00.143 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "pcptest_pw": {
            "__ansible_vault": "$ANSIBLE_VAULT;1.1;AES256\n65343431623161346664373330646165636437656265656632613961363839303132393064663934\n3137396633373562393466633037356533326566343338350a386238333034336162333932313162\n62643937336534356131376134303463306466316433366636643562633637376336653034646334\n3063663466333735390a333330366461386166633233373133326237323663333831653232646566\n3363\n"
        }
    }, 
    "ansible_included_var_files": [
        "/tmp/metrics-wKA/tests/vars/vault-variables.yml"
    ], 
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [Test import from Elasticsearch] ******************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/tests_verify_from_elasticsearch.yml:9
Saturday 17 August 2024  06:22:55 -0400 (0:00:00.021)       0:00:00.165 ******* 
ok: [managed_node1]
META: end_host conditional evaluated to false, continuing execution for managed_node1

TASK [Get initial state of services] *******************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/get_services_state.yml:3
Saturday 17 August 2024  06:22:56 -0400 (0:00:01.469)       0:00:01.634 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "NetworkManager.service": {
                "name": "NetworkManager.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "arp-ethers.service": {
                "name": "arp-ethers.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "auditd.service": {
                "name": "auditd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "autovt@.service": {
                "name": "autovt@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "enabled"
            }, 
            "blk-availability.service": {
                "name": "blk-availability.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "brandbot.service": {
                "name": "brandbot.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "chrony-dnssrv@.service": {
                "name": "chrony-dnssrv@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "chrony-wait.service": {
                "name": "chrony-wait.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "chronyd.service": {
                "name": "chronyd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "cloud-config.service": {
                "name": "cloud-config.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "cloud-final.service": {
                "name": "cloud-final.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "cloud-init-local.service": {
                "name": "cloud-init-local.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "cloud-init.service": {
                "name": "cloud-init.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "console-getty.service": {
                "name": "console-getty.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "console-shell.service": {
                "name": "console-shell.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "container-getty@.service": {
                "name": "container-getty@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "cpupower.service": {
                "name": "cpupower.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "crond.service": {
                "name": "crond.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.import1.service": {
                "name": "dbus-org.freedesktop.import1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service", 
                "source": "systemd", 
                "state": "active", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.machine1.service": {
                "name": "dbus-org.freedesktop.machine1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus.service": {
                "name": "dbus.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "debug-shell.service": {
                "name": "debug-shell.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-mount.service": {
                "name": "dracut-mount.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "ebtables.service": {
                "name": "ebtables.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "emergency.service": {
                "name": "emergency.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "firewalld.service": {
                "name": "firewalld.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "fstrim.service": {
                "name": "fstrim.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "getty@.service": {
                "name": "getty@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "enabled"
            }, 
            "getty@tty1.service": {
                "name": "getty@tty1.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "unknown"
            }, 
            "gssproxy.service": {
                "name": "gssproxy.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "disabled"
            }, 
            "halt-local.service": {
                "name": "halt-local.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "iprdump.service": {
                "name": "iprdump.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "iprinit.service": {
                "name": "iprinit.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "iprupdate.service": {
                "name": "iprupdate.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "irqbalance.service": {
                "name": "irqbalance.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "kdump.service": {
                "name": "kdump.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "messagebus.service": {
                "name": "messagebus.service", 
                "source": "systemd", 
                "state": "active", 
                "status": "static"
            }, 
            "microcode.service": {
                "name": "microcode.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "netconsole": {
                "name": "netconsole", 
                "source": "sysv", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "network": {
                "name": "network", 
                "source": "sysv", 
                "state": "running", 
                "status": "enabled"
            }, 
            "network.service": {
                "name": "network.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "unknown"
            }, 
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "nfs-config.service": {
                "name": "nfs-config.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs-idmap.service": {
                "name": "nfs-idmap.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs-lock.service": {
                "name": "nfs-lock.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "nfs-mountd.service": {
                "name": "nfs-mountd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs-rquotad.service": {
                "name": "nfs-rquotad.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "nfs-secure.service": {
                "name": "nfs-secure.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "nfs-server.service": {
                "name": "nfs-server.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "nfs-utils.service": {
                "name": "nfs-utils.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs.service": {
                "name": "nfs.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "nfslock.service": {
                "name": "nfslock.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "plymouth-halt.service": {
                "name": "plymouth-halt.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-kexec.service": {
                "name": "plymouth-kexec.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-poweroff.service": {
                "name": "plymouth-poweroff.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-quit.service": {
                "name": "plymouth-quit.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-read-write.service": {
                "name": "plymouth-read-write.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-reboot.service": {
                "name": "plymouth-reboot.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-start.service": {
                "name": "plymouth-start.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-switch-root.service": {
                "name": "plymouth-switch-root.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "polkit.service": {
                "name": "polkit.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "postfix.service": {
                "name": "postfix.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "quotaon.service": {
                "name": "quotaon.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "rc-local.service": {
                "name": "rc-local.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rdisc.service": {
                "name": "rdisc.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "rescue.service": {
                "name": "rescue.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "restraintd.service": {
                "name": "restraintd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "rhel-autorelabel-mark.service": {
                "name": "rhel-autorelabel-mark.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-autorelabel.service": {
                "name": "rhel-autorelabel.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-configure.service": {
                "name": "rhel-configure.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-dmesg.service": {
                "name": "rhel-dmesg.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-domainname.service": {
                "name": "rhel-domainname.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-import-state.service": {
                "name": "rhel-import-state.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-loadmodules.service": {
                "name": "rhel-loadmodules.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-readonly.service": {
                "name": "rhel-readonly.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rngd.service": {
                "name": "rngd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "rpc-gssd.service": {
                "name": "rpc-gssd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rpc-rquotad.service": {
                "name": "rpc-rquotad.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rpc-statd.service": {
                "name": "rpc-statd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rpcbind.service": {
                "name": "rpcbind.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "rpcgssd.service": {
                "name": "rpcgssd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "rpcidmapd.service": {
                "name": "rpcidmapd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "rsyncd.service": {
                "name": "rsyncd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "rsyncd@.service": {
                "name": "rsyncd@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "rsyslog.service": {
                "name": "rsyslog.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "selinux-policy-migrate-local-changes@.service": {
                "name": "selinux-policy-migrate-local-changes@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "selinux-policy-migrate-local-changes@targeted.service": {
                "name": "selinux-policy-migrate-local-changes@targeted.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "unknown"
            }, 
            "serial-getty@.service": {
                "name": "serial-getty@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "disabled"
            }, 
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "unknown"
            }, 
            "sshd-keygen.service": {
                "name": "sshd-keygen.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "sshd.service": {
                "name": "sshd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "sshd@.service": {
                "name": "sshd@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-ask-password-plymouth.service": {
                "name": "systemd-ask-password-plymouth.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-bootchart.service": {
                "name": "systemd-bootchart.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-halt.service": {
                "name": "systemd-halt.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-hibernate-resume@.service": {
                "name": "systemd-hibernate-resume@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-importd.service": {
                "name": "systemd-importd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-initctl.service": {
                "name": "systemd-initctl.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-journald.service": {
                "name": "systemd-journald.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "systemd-kexec.service": {
                "name": "systemd-kexec.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-localed.service": {
                "name": "systemd-localed.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-logind.service": {
                "name": "systemd-logind.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-machined.service": {
                "name": "systemd-machined.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-nspawn@.service": {
                "name": "systemd-nspawn@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "disabled"
            }, 
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-quotacheck.service": {
                "name": "systemd-quotacheck.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-readahead-collect.service": {
                "name": "systemd-readahead-collect.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "systemd-readahead-done.service": {
                "name": "systemd-readahead-done.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "indirect"
            }, 
            "systemd-readahead-drop.service": {
                "name": "systemd-readahead-drop.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "systemd-readahead-replay.service": {
                "name": "systemd-readahead-replay.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "systemd-reboot.service": {
                "name": "systemd-reboot.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-rfkill@.service": {
                "name": "systemd-rfkill@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-shutdownd.service": {
                "name": "systemd-shutdownd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-suspend.service": {
                "name": "systemd-suspend.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-timedated.service": {
                "name": "systemd-timedated.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-udevd.service": {
                "name": "systemd-udevd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "systemd-update-done.service": {
                "name": "systemd-update-done.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "teamd@.service": {
                "name": "teamd@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "tuned.service": {
                "name": "tuned.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "wpa_supplicant.service": {
                "name": "wpa_supplicant.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }
        }
    }, 
    "changed": false
}
META: ran handlers

TASK [Run the metrics role to configure Elasticsearch] *************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/tests_verify_from_elasticsearch.yml:25
Saturday 17 August 2024  06:22:58 -0400 (0:00:01.358)       0:00:02.993 ******* 

TASK [fedora.linux_system_roles.metrics : Ensure ansible_facts used by role] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:3
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.056)       0:00:03.049 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Add Elasticsearch to metrics domain list] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:8
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.039)       0:00:03.089 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__metrics_domains": [
            "elasticsearch"
        ]
    }, 
    "changed": false
}

TASK [fedora.linux_system_roles.metrics : Add SQL Server to metrics domain list] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:13
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.050)       0:00:03.139 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Add Postfix to metrics domain list] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:18
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.042)       0:00:03.182 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Add bpftrace to metrics domain list] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:23
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.041)       0:00:03.223 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Setup metrics access for roles] ******
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:28
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.040)       0:00:03.264 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__metrics_accounts": [
            {
                "saslpassword": "metrics", 
                "sasluser": "metrics", 
                "user": "metrics"
            }
        ]
    }, 
    "changed": false
}

TASK [Configure Elasticsearch metrics] *****************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:35
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.032)       0:00:03.296 ******* 

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Set platform/version specific variables] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:4
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.113)       0:00:03.410 ******* 
ok: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/default.yml) => {
    "ansible_facts": {
        "elasticsearch_metrics_provider": "pcp"
    }, 
    "ansible_included_var_files": [
        "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/default.yml"
    ], 
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/default.yml"
}
ok: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/RedHat.yml) => {
    "ansible_facts": {
        "__elasticsearch_packages_pcp": [
            "pcp-pmda-elasticsearch"
        ], 
        "__elasticsearch_service_path": "/usr/lib/systemd/system"
    }, 
    "ansible_included_var_files": [
        "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/RedHat.yml"
    ], 
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/RedHat.yml"
}
skipping: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS.yml)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS.yml", 
    "skip_reason": "Conditional result was False"
}
ok: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS_7.yml) => {
    "ansible_facts": {
        "__elasticsearch_conf_dir": "/var/lib/pcp/pmdas/elasticsearch", 
        "__elasticsearch_packages_export_pcp": [
            "pcp-export-pcp2elasticsearch", 
            "pcp-system-tools"
        ]
    }, 
    "ansible_included_var_files": [
        "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS_7.yml"
    ], 
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS_7.yml"
}
skipping: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS_7.9.yml)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/vars/CentOS_7.9.yml", 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Check if system is ostree] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:18
Saturday 17 August 2024  06:22:58 -0400 (0:00:00.196)       0:00:03.607 ******* 
ok: [managed_node1] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Set flag to indicate system is ostree] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:23
Saturday 17 August 2024  06:22:59 -0400 (0:00:00.480)       0:00:04.087 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__ansible_pcp_is_ostree": false
    }, 
    "changed": false
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Establish Elasticsearch metrics package names] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:27
Saturday 17 August 2024  06:22:59 -0400 (0:00:00.053)       0:00:04.140 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__elasticsearch_packages_extra": [
            "pcp-pmda-elasticsearch"
        ]
    }, 
    "changed": false
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Establish Elasticsearch metrics export package names] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:34
Saturday 17 August 2024  06:22:59 -0400 (0:00:00.049)       0:00:04.190 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Install needed Elasticsearch metrics packages] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:41
Saturday 17 August 2024  06:22:59 -0400 (0:00:00.046)       0:00:04.236 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "changes": {
        "installed": [
            "pcp-pmda-elasticsearch"
        ]
    }, 
    "rc": 0, 
    "results": [
        "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * epel: dl.fedoraproject.org\n * epel-debuginfo: dl.fedoraproject.org\n * epel-source: dl.fedoraproject.org\nResolving Dependencies\n--> Running transaction check\n---> Package pcp-pmda-elasticsearch.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Processing Dependency: python-pcp for package: pcp-pmda-elasticsearch-4.3.2-13.el7_9.x86_64\n--> Running transaction check\n---> Package python-pcp.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Processing Dependency: pcp-libs = 4.3.2-13.el7_9 for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: pcp = 4.3.2-13.el7_9 for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_pmda.so.3(PCP_PMDA_3.9)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_pmda.so.3(PCP_PMDA_3.8)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_pmda.so.3(PCP_PMDA_3.7)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_pmda.so.3(PCP_PMDA_3.0)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3(PCP_3.6)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3(PCP_3.22)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3(PCP_3.21)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3(PCP_3.2)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3(PCP_3.16)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3(PCP_3.0)(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_pmda.so.3()(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_mmv.so.1()(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_import.so.1()(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp_gui.so.2()(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: libpcp.so.3()(64bit) for package: python-pcp-4.3.2-13.el7_9.x86_64\n--> Running transaction check\n---> Package pcp.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Processing Dependency: pcp-selinux = 4.3.2-13.el7_9 for package: pcp-4.3.2-13.el7_9.x86_64\n---> Package pcp-libs.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Processing Dependency: pcp-conf = 4.3.2-13.el7_9 for package: pcp-libs-4.3.2-13.el7_9.x86_64\n--> Running transaction check\n---> Package pcp-conf.x86_64 0:4.3.2-13.el7_9 will be installed\n---> Package pcp-selinux.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package                     Arch        Version             Repository    Size\n================================================================================\nInstalling:\n pcp-pmda-elasticsearch      x86_64      4.3.2-13.el7_9      updates       29 k\nInstalling for dependencies:\n pcp                         x86_64      4.3.2-13.el7_9      updates      1.0 M\n pcp-conf                    x86_64      4.3.2-13.el7_9      updates       37 k\n pcp-libs                    x86_64      4.3.2-13.el7_9      updates      467 k\n pcp-selinux                 x86_64      4.3.2-13.el7_9      updates       34 k\n 
python-pcp                  x86_64      4.3.2-13.el7_9      updates      143 k\n\nTransaction Summary\n================================================================================\nInstall  1 Package (+5 Dependent packages)\n\nTotal download size: 1.7 M\nInstalled size: 5.2 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal                                              5.2 MB/s | 1.7 MB  00:00     \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n  Installing : pcp-conf-4.3.2-13.el7_9.x86_64                               1/6 \n  Installing : pcp-libs-4.3.2-13.el7_9.x86_64                               2/6 \n  Installing : pcp-selinux-4.3.2-13.el7_9.x86_64                            3/6 \n  Installing : pcp-4.3.2-13.el7_9.x86_64                                    4/6 \n  Installing : python-pcp-4.3.2-13.el7_9.x86_64                             5/6 \n  Installing : pcp-pmda-elasticsearch-4.3.2-13.el7_9.x86_64                 6/6 \n  Verifying  : pcp-4.3.2-13.el7_9.x86_64                                    1/6 \n  Verifying  : python-pcp-4.3.2-13.el7_9.x86_64                             2/6 \n  Verifying  : pcp-selinux-4.3.2-13.el7_9.x86_64                            3/6 \n  Verifying  : pcp-pmda-elasticsearch-4.3.2-13.el7_9.x86_64                 4/6 \n  Verifying  : pcp-libs-4.3.2-13.el7_9.x86_64                               5/6 \n  Verifying  : pcp-conf-4.3.2-13.el7_9.x86_64                               6/6 \n\nInstalled:\n  pcp-pmda-elasticsearch.x86_64 0:4.3.2-13.el7_9                                \n\nDependency Installed:\n  pcp.x86_64 0:4.3.2-13.el7_9            pcp-conf.x86_64 0:4.3.2-13.el7_9       \n  pcp-libs.x86_64 0:4.3.2-13.el7_9       pcp-selinux.x86_64 0:4.3.2-13.el7_9    \n  python-pcp.x86_64 0:4.3.2-13.el7_9    \n\nComplete!\n"
    ]
}
lsrpackages: pcp-pmda-elasticsearch

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Ensure PCP Elasticsearch agent configuration directory exists] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:48
Saturday 17 August 2024  06:23:25 -0400 (0:00:26.426)       0:00:30.663 ******* 
ok: [managed_node1] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/var/lib/pcp/pmdas/elasticsearch", 
    "secontext": "system_u:object_r:pcp_var_lib_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Ensure PCP Elasticsearch agent is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:55
Saturday 17 August 2024  06:23:26 -0400 (0:00:00.513)       0:00:31.176 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "13f91d28ea10d21516fb892b9c304eb8001fb026", 
    "dest": "/var/lib/pcp/pmdas/elasticsearch/elasticsearch.conf", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "5a155d222202413b5131bf79404124e4", 
    "mode": "0600", 
    "owner": "root", 
    "secontext": "system_u:object_r:pcp_var_lib_t:s0", 
    "size": 127, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890206.43-10528-182672214824311/source", 
    "state": "file", 
    "uid": 0
}
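
The result above records the destination and mode of the rendered agent configuration; a hedged sketch of an equivalent task (the template name and any variables it consumes are assumptions):

    - name: Ensure PCP Elasticsearch agent is configured (sketch)
      template:
        src: elasticsearch.conf.j2                                 # assumed template name
        dest: /var/lib/pcp/pmdas/elasticsearch/elasticsearch.conf
        mode: "0600"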

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Ensure correct service path for ostree systems] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:65
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.706)       0:00:31.882 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Ensure PCP Elasticsearch export service exists] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:72
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.039)       0:00:31.922 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Ensure PCP Elasticsearch export is running and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:81
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.033)       0:00:31.955 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Configure SQL Server metrics.] *******************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:50
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.033)       0:00:31.988 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Configure Postfix metrics.] **********************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:58
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.028)       0:00:32.017 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Setup bpftrace metrics.] *************************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:66
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.028)       0:00:32.046 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Setup metric querying service.] ******************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:75
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.027)       0:00:32.074 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Setup metric collection service.] ****************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:83
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.028)       0:00:32.103 ******* 

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Set platform/version specific variables] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:4
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.089)       0:00:32.192 ******* 
ok: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/default.yml) => {
    "ansible_facts": {}, 
    "ansible_included_var_files": [
        "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/default.yml"
    ], 
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/default.yml"
}
ok: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/RedHat.yml) => {
    "ansible_facts": {
        "__pcp_pmcd_defaults_path": "/etc/sysconfig/pmcd", 
        "__pcp_pmlogger_defaults_path": "/etc/sysconfig/pmlogger", 
        "__pcp_pmlogger_timers_path": "/etc/sysconfig/pmlogger_timers", 
        "__pcp_pmproxy_defaults_path": "/etc/sysconfig/pmproxy"
    }, 
    "ansible_included_var_files": [
        "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/RedHat.yml"
    ], 
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/RedHat.yml"
}
skipping: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS.yml)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS.yml", 
    "skip_reason": "Conditional result was False"
}
ok: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS_7.yml) => {
    "ansible_facts": {
        "__pcp_packages_extra": [
            "pcp-zeroconf"
        ], 
        "__pcp_sasl_mechlist": "scram-sha-256", 
        "__pcp_sasl_packages": [
            "cyrus-sasl-lib", 
            "cyrus-sasl-scram"
        ]
    }, 
    "ansible_included_var_files": [
        "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS_7.yml"
    ], 
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS_7.yml"
}
skipping: [managed_node1] => (item=/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS_7.9.yml)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/vars/CentOS_7.9.yml", 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Check if system is ostree] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:18
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.128)       0:00:32.321 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Set flag to indicate system is ostree] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:23
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.029)       0:00:32.350 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Install Performance Co-Pilot packages] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:27
Saturday 17 August 2024  06:23:27 -0400 (0:00:00.028)       0:00:32.379 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "changes": {
        "installed": [
            "pcp-zeroconf"
        ]
    }, 
    "rc": 0, 
    "results": [
        "pcp-4.3.2-13.el7_9.x86_64 providing pcp is already installed", 
        "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * epel: d2lzkl7pfhq30w.cloudfront.net\n * epel-debuginfo: d2lzkl7pfhq30w.cloudfront.net\n * epel-source: d2lzkl7pfhq30w.cloudfront.net\nResolving Dependencies\n--> Running transaction check\n---> Package pcp-zeroconf.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Processing Dependency: pcp-system-tools for package: pcp-zeroconf-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: pcp-pmda-nfsclient for package: pcp-zeroconf-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: pcp-pmda-dm for package: pcp-zeroconf-4.3.2-13.el7_9.x86_64\n--> Processing Dependency: pcp-doc for package: pcp-zeroconf-4.3.2-13.el7_9.x86_64\n--> Running transaction check\n---> Package pcp-doc.noarch 0:4.3.2-13.el7_9 will be installed\n---> Package pcp-pmda-dm.x86_64 0:4.3.2-13.el7_9 will be installed\n---> Package pcp-pmda-nfsclient.x86_64 0:4.3.2-13.el7_9 will be installed\n---> Package pcp-system-tools.x86_64 0:4.3.2-13.el7_9 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package                  Arch         Version              Repository     Size\n================================================================================\nInstalling:\n pcp-zeroconf             x86_64       4.3.2-13.el7_9       updates        32 k\nInstalling for dependencies:\n pcp-doc                  noarch       4.3.2-13.el7_9       updates       3.8 M\n pcp-pmda-dm              x86_64       4.3.2-13.el7_9       updates        51 k\n pcp-pmda-nfsclient       x86_64       4.3.2-13.el7_9       updates        32 k\n pcp-system-tools         x86_64       4.3.2-13.el7_9       updates       177 k\n\nTransaction Summary\n================================================================================\nInstall  1 Package (+4 Dependent packages)\n\nTotal download size: 4.1 M\nInstalled size: 11 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal                                               13 MB/s | 4.1 MB  00:00     \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n  Installing : pcp-system-tools-4.3.2-13.el7_9.x86_64                       1/5 \n  Installing : pcp-pmda-nfsclient-4.3.2-13.el7_9.x86_64                     2/5 \n  Installing : pcp-doc-4.3.2-13.el7_9.noarch                                3/5 \n  Installing : pcp-pmda-dm-4.3.2-13.el7_9.x86_64                            4/5 \n  Installing : pcp-zeroconf-4.3.2-13.el7_9.x86_64                           5/5 \n  Verifying  : pcp-pmda-dm-4.3.2-13.el7_9.x86_64                            1/5 \n  Verifying  : pcp-doc-4.3.2-13.el7_9.noarch                                2/5 \n  Verifying  : pcp-pmda-nfsclient-4.3.2-13.el7_9.x86_64                     3/5 \n  Verifying  : pcp-system-tools-4.3.2-13.el7_9.x86_64                       4/5 \n  Verifying  : pcp-zeroconf-4.3.2-13.el7_9.x86_64                           5/5 \n\nInstalled:\n  pcp-zeroconf.x86_64 0:4.3.2-13.el7_9                                          \n\nDependency Installed:\n  pcp-doc.noarch 0:4.3.2-13.el7_9                                               \n  pcp-pmda-dm.x86_64 0:4.3.2-13.el7_9                                           \n  pcp-pmda-nfsclient.x86_64 0:4.3.2-13.el7_9                                    \n  pcp-system-tools.x86_64 0:4.3.2-13.el7_9                                      \n\nComplete!\n"
    ]
}
lsrpackages: pcp pcp-zeroconf

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Install authentication packages] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:33
Saturday 17 August 2024  06:23:39 -0400 (0:00:12.235)       0:00:44.615 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "changes": {
        "installed": [
            "cyrus-sasl-scram"
        ]
    }, 
    "rc": 0, 
    "results": [
        "cyrus-sasl-lib-2.1.26-24.el7_9.x86_64 providing cyrus-sasl-lib is already installed", 
        "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * epel: dl.fedoraproject.org\n * epel-debuginfo: dl.fedoraproject.org\n * epel-source: dl.fedoraproject.org\nResolving Dependencies\n--> Running transaction check\n---> Package cyrus-sasl-scram.x86_64 0:2.1.26-24.el7_9 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package                Arch         Version                Repository     Size\n================================================================================\nInstalling:\n cyrus-sasl-scram       x86_64       2.1.26-24.el7_9        updates        43 k\n\nTransaction Summary\n================================================================================\nInstall  1 Package\n\nTotal download size: 43 k\nInstalled size: 40 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n  Installing : cyrus-sasl-scram-2.1.26-24.el7_9.x86_64                      1/1 \n  Verifying  : cyrus-sasl-scram-2.1.26-24.el7_9.x86_64                      1/1 \n\nInstalled:\n  cyrus-sasl-scram.x86_64 0:2.1.26-24.el7_9                                     \n\nComplete!\n"
    ]
}
lsrpackages: cyrus-sasl-lib cyrus-sasl-scram

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Include pmcd] ****
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:42
Saturday 17 August 2024  06:23:43 -0400 (0:00:03.330)       0:00:47.946 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml for managed_node1

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : List optional metric collection agents to be enabled] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:4
Saturday 17 August 2024  06:23:43 -0400 (0:00:00.083)       0:00:48.029 ******* 
ok: [managed_node1] => (item=elasticsearch) => {}

MSG:

NeedInstall agent: elasticsearch from [u'elasticsearch']

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Extract metric collection configuration file content] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:9
Saturday 17 August 2024  06:23:43 -0400 (0:00:00.047)       0:00:48.077 ******* 
ok: [managed_node1] => {
    "changed": false, 
    "cmd": [
        "cat", 
        "/etc/pcp/pmcd/pmcd.conf"
    ], 
    "delta": "0:00:00.002925", 
    "end": "2024-08-17 06:23:43.704283", 
    "rc": 0, 
    "start": "2024-08-17 06:23:43.701358"
}

STDOUT:

# 
# Name  Id      IPC     IPC Params      File/Cmd
# Performance Metrics Domain Specifications
# This file is automatically generated during the build
root	1	pipe	binary		/var/lib/pcp/pmdas/root/pmdaroot
pmcd	2	dso	pmcd_init	/var/lib/pcp/pmdas/pmcd/pmda_pmcd.so
proc	3	pipe	binary		/var/lib/pcp/pmdas/proc/pmdaproc -d 3
pmproxy	4	dso	pmproxy_init	/var/lib/pcp/pmdas/mmv/pmda_mmv.so
xfs	11	pipe	binary		/var/lib/pcp/pmdas/xfs/pmdaxfs -d 11
linux	60	pipe	binary		/var/lib/pcp/pmdas/linux/pmdalinux
nfsclient	62	pipe	binary		python /var/lib/pcp/pmdas/nfsclient/pmdanfsclient.python 
mmv	70	dso	mmv_init	/var/lib/pcp/pmdas/mmv/pmda_mmv.so
kvm	95	pipe	binary		/var/lib/pcp/pmdas/kvm/pmdakvm -d 95
jbd2	122	dso	jbd2_init	/var/lib/pcp/pmdas/jbd2/pmda_jbd2.so
dm	129	pipe	binary 		/var/lib/pcp/pmdas/dm/pmdadm -d 129 

[access]
disallow ".*" : store;
disallow ":*" : store;
allow "local:*" : all;

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure optional metric collection agents are enabled] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:14
Saturday 17 August 2024  06:23:43 -0400 (0:00:00.485)       0:00:48.562 ******* 
changed: [managed_node1] => (item=elasticsearch) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/pmdas/elasticsearch/.NeedInstall", 
    "gid": 0, 
    "group": "root", 
    "item": "elasticsearch", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 0, 
    "state": "file", 
    "uid": 0
}
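
Enabling an optional agent amounts to dropping a zero-length .NeedInstall marker into the PMDA's directory; PCP's pmcd startup scripting then runs that agent's Install script on the next restart. A sketch of an equivalent task (module choice is an assumption; the result above shows an empty file with mode 0644):

    - name: Ensure optional metric collection agents are enabled (sketch)
      file:
        path: "/var/lib/pcp/pmdas/{{ item }}/.NeedInstall"
        state: touch
        mode: "0644"
      loop:
        - elasticsearch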

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure explicit metric label path exists] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:23
Saturday 17 August 2024  06:23:44 -0400 (0:00:00.359)       0:00:48.922 ******* 
ok: [managed_node1] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/pcp/labels", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure implicit metric label path exists] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:31
Saturday 17 August 2024  06:23:44 -0400 (0:00:00.383)       0:00:49.306 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/pcp/labels/optional", 
    "secontext": "unconfined_u:object_r:etc_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure any explicit metric labels are configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:39
Saturday 17 August 2024  06:23:44 -0400 (0:00:00.350)       0:00:49.657 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "5f36b2ea290645ee34d943220a14b54ee5ea5be5", 
    "dest": "/etc/pcp/labels/ansible-managed", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "8a80554c91d9fca8acb82f023de02f11", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 3, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890224.91-10974-212342487042925/source", 
    "state": "file", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure any implicit metric labels are configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:46
Saturday 17 August 2024  06:23:45 -0400 (0:00:00.669)       0:00:50.327 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "5f36b2ea290645ee34d943220a14b54ee5ea5be5", 
    "dest": "/etc/pcp/labels/optional/ansible-managed", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "8a80554c91d9fca8acb82f023de02f11", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 3, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890225.63-11022-132868635777746/source", 
    "state": "file", 
    "uid": 0
}
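
Both ansible-managed label files written above are 3 bytes with identical checksums, consistent with an empty JSON object (plus newline) when no custom labels are supplied. A hedged sketch of an equivalent task (the variable name is an assumption):

    - name: Ensure any explicit metric labels are configured (sketch)
      copy:
        content: "{{ metrics_labels | default({}) | to_nice_json }}"   # assumed variable; renders to {} here
        dest: /etc/pcp/labels/ansible-managed
        mode: "0644"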

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:53
Saturday 17 August 2024  06:23:46 -0400 (0:00:00.730)       0:00:51.058 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "7518789c091387cd9c322e1a8fa8aad21d4efbd3", 
    "dest": "/etc/sysconfig/pmcd", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "073cb531c98ecbe1841811dc55975e29", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 1627, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890226.32-11077-247459617408711/source", 
    "state": "file", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector system accounts are configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:60
Saturday 17 August 2024  06:23:46 -0400 (0:00:00.722)       0:00:51.780 ******* 
changed: [managed_node1] => (item={u'sasluser': u'metrics', u'user': u'metrics', u'saslpassword': u'metrics'}) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "comment": "", 
    "create_home": true, 
    "group": 993, 
    "home": "/home/metrics", 
    "item": {
        "saslpassword": "metrics", 
        "sasluser": "metrics", 
        "user": "metrics"
    }, 
    "name": "metrics", 
    "shell": "/bin/bash", 
    "state": "present", 
    "system": true, 
    "uid": 996
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector SASL accounts are configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:68
Saturday 17 August 2024  06:23:47 -0400 (0:00:00.884)       0:00:52.665 ******* 
ok: [managed_node1] => (item={u'sasluser': u'metrics', u'user': u'metrics', u'saslpassword': u'metrics'}) => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "cmd": "set -eu\nif set -o | grep -q pipefail; then\n  set -o pipefail  # pipefail not supported on debian, some ubuntu\nfi\nif ! sasldblistusers2 -f \"/etc/pcp/passwd.db\" | grep -q \"^metrics@\"; then\n  echo \"Creating new metrics user in /etc/pcp/passwd.db\"\n  echo \"metrics\" | saslpasswd2 -a pmcd \"metrics\"\n  chown root:pcp \"/etc/pcp/passwd.db\"\n  chmod 640 \"/etc/pcp/passwd.db\"\nfi\n", 
    "delta": "0:00:00.149100", 
    "end": "2024-08-17 06:23:48.341445", 
    "item": {
        "saslpassword": "metrics", 
        "sasluser": "metrics", 
        "user": "metrics"
    }, 
    "rc": 0, 
    "start": "2024-08-17 06:23:48.192345"
}

STDOUT:

Creating new metrics user in /etc/pcp/passwd.db


STDERR:

listusers failed
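
The "listusers failed" on stderr comes from sasldblistusers2 being run against /etc/pcp/passwd.db before that database exists; the guarded shell then falls through, creates the metrics user, and the task finishes with rc 0. A sketch of an equivalent task around the same shell (loop shape and changed_when handling are assumptions; values are taken from the result above):

    - name: Ensure performance metric collector SASL accounts are configured (sketch)
      shell: |
        set -eu
        if ! sasldblistusers2 -f /etc/pcp/passwd.db | grep -q "^{{ item.sasluser }}@"; then
          echo "{{ item.saslpassword }}" | saslpasswd2 -a pmcd "{{ item.sasluser }}"
          chown root:pcp /etc/pcp/passwd.db
          chmod 640 /etc/pcp/passwd.db
        fi
      loop:
        - { user: metrics, sasluser: metrics, saslpassword: metrics }
      changed_when: false   # the run above reports ok even though the user is created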

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector authentication is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:86
Saturday 17 August 2024  06:23:48 -0400 (0:00:00.634)       0:00:53.300 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "615d2de55ab86108da0c7e6b64988fecb4169771", 
    "dest": "/etc/sasl2/pmcd.conf", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "6ec6ea6e2e76889d95da22305316a5fe", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 998, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890228.56-11218-112180509999317/source", 
    "state": "file", 
    "uid": 0
}
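
The SASL configuration written above ties the collector to the scram-sha-256 mechanism selected in the CentOS_7 vars and to the /etc/pcp/passwd.db database used by the account task. A minimal sketch of equivalent content; the file the role actually templates is 998 bytes, so this shows only the assumed core settings:

    - name: Ensure performance metric collector authentication is configured (sketch)
      copy:
        dest: /etc/sasl2/pmcd.conf
        mode: "0644"
        content: |
          mech_list: scram-sha-256
          sasldb_path: /etc/pcp/passwd.db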

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Set variable to do pmcd restart if needed] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:94
Saturday 17 August 2024  06:23:49 -0400 (0:00:00.637)       0:00:53.938 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__pcp_restart_pmcd": true
    }, 
    "changed": false
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Report performance metric collector restart state] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:99
Saturday 17 August 2024  06:23:49 -0400 (0:00:00.039)       0:00:53.977 ******* 
ok: [managed_node1] => {}

MSG:

[u'optional_agents: True', u'explicit_labels: True', u'implicit_labels: True', u'defaults_config: True', u'authentication: True', u'restart_pmcd: True']

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector is running and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:110
Saturday 17 August 2024  06:23:49 -0400 (0:00:00.045)       0:00:54.022 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector is restarted and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:117
Saturday 17 August 2024  06:23:49 -0400 (0:00:00.032)       0:00:54.055 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "enabled": true, 
    "name": "pmcd", 
    "state": "started", 
    "status": {
        "ActiveEnterTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "ActiveEnterTimestampMonotonic": "318929590", 
        "ActiveExitTimestamp": "Sat 2024-08-17 06:23:36 EDT", 
        "ActiveExitTimestampMonotonic": "317958382", 
        "ActiveState": "active", 
        "After": "network-online.target system.slice basic.target systemd-journald.socket avahi-daemon.service", 
        "AllowIsolate": "no", 
        "AmbientCapabilities": "0", 
        "AssertResult": "yes", 
        "AssertTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "AssertTimestampMonotonic": "318697151", 
        "Before": "zabbix-agent.service pmie.service pmlogger.service multi-user.target shutdown.target", 
        "BlockIOAccounting": "no", 
        "BlockIOWeight": "18446744073709551615", 
        "CPUAccounting": "no", 
        "CPUQuotaPerSecUSec": "infinity", 
        "CPUSchedulingPolicy": "0", 
        "CPUSchedulingPriority": "0", 
        "CPUSchedulingResetOnFork": "no", 
        "CPUShares": "18446744073709551615", 
        "CanIsolate": "no", 
        "CanReload": "no", 
        "CanStart": "yes", 
        "CanStop": "yes", 
        "CapabilityBoundingSet": "18446744073709551615", 
        "CollectMode": "inactive", 
        "ConditionResult": "yes", 
        "ConditionTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "ConditionTimestampMonotonic": "318697150", 
        "Conflicts": "shutdown.target", 
        "ControlGroup": "/system.slice/pmcd.service", 
        "ControlPID": "0", 
        "DefaultDependencies": "yes", 
        "Delegate": "no", 
        "Description": "Performance Metrics Collector Daemon", 
        "DevicePolicy": "auto", 
        "Documentation": "man:pmcd(8)", 
        "ExecMainCode": "0", 
        "ExecMainExitTimestampMonotonic": "0", 
        "ExecMainPID": "16789", 
        "ExecMainStartTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "ExecMainStartTimestampMonotonic": "318929402", 
        "ExecMainStatus": "0", 
        "ExecStart": "{ path=/usr/share/pcp/lib/pmcd ; argv[]=/usr/share/pcp/lib/pmcd start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "ExecStop": "{ path=/usr/share/pcp/lib/pmcd ; argv[]=/usr/share/pcp/lib/pmcd stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "FailureAction": "none", 
        "FileDescriptorStoreMax": "0", 
        "FragmentPath": "/usr/lib/systemd/system/pmcd.service", 
        "GuessMainPID": "yes", 
        "IOScheduling": "0", 
        "Id": "pmcd.service", 
        "IgnoreOnIsolate": "no", 
        "IgnoreOnSnapshot": "no", 
        "IgnoreSIGPIPE": "yes", 
        "InactiveEnterTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "InactiveEnterTimestampMonotonic": "318695812", 
        "InactiveExitTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "InactiveExitTimestampMonotonic": "318697800", 
        "JobTimeoutAction": "none", 
        "JobTimeoutUSec": "0", 
        "KillMode": "control-group", 
        "KillSignal": "15", 
        "LimitAS": "18446744073709551615", 
        "LimitCORE": "18446744073709551615", 
        "LimitCPU": "18446744073709551615", 
        "LimitDATA": "18446744073709551615", 
        "LimitFSIZE": "18446744073709551615", 
        "LimitLOCKS": "18446744073709551615", 
        "LimitMEMLOCK": "65536", 
        "LimitMSGQUEUE": "819200", 
        "LimitNICE": "0", 
        "LimitNOFILE": "4096", 
        "LimitNPROC": "14311", 
        "LimitRSS": "18446744073709551615", 
        "LimitRTPRIO": "0", 
        "LimitRTTIME": "18446744073709551615", 
        "LimitSIGPENDING": "14311", 
        "LimitSTACK": "18446744073709551615", 
        "LoadState": "loaded", 
        "MainPID": "16789", 
        "MemoryAccounting": "no", 
        "MemoryCurrent": "18446744073709551615", 
        "MemoryLimit": "18446744073709551615", 
        "MountFlags": "0", 
        "Names": "pmcd.service", 
        "NeedDaemonReload": "no", 
        "Nice": "0", 
        "NoNewPrivileges": "no", 
        "NonBlocking": "no", 
        "NotifyAccess": "none", 
        "OOMScoreAdjust": "0", 
        "OnFailureJobMode": "replace", 
        "PIDFile": "/run/pcp/pmcd.pid", 
        "PermissionsStartOnly": "no", 
        "PrivateDevices": "no", 
        "PrivateNetwork": "no", 
        "PrivateTmp": "no", 
        "ProtectHome": "no", 
        "ProtectSystem": "no", 
        "RefuseManualStart": "no", 
        "RefuseManualStop": "no", 
        "RemainAfterExit": "no", 
        "Requires": "basic.target system.slice", 
        "Restart": "always", 
        "RestartUSec": "100ms", 
        "Result": "success", 
        "RootDirectoryStartOnly": "no", 
        "RuntimeDirectoryMode": "0755", 
        "SameProcessGroup": "no", 
        "SecureBits": "0", 
        "SendSIGHUP": "no", 
        "SendSIGKILL": "yes", 
        "Slice": "system.slice", 
        "StandardError": "inherit", 
        "StandardInput": "null", 
        "StandardOutput": "journal", 
        "StartLimitAction": "none", 
        "StartLimitBurst": "5", 
        "StartLimitInterval": "10000000", 
        "StartupBlockIOWeight": "18446744073709551615", 
        "StartupCPUShares": "18446744073709551615", 
        "StatusErrno": "0", 
        "StopWhenUnneeded": "no", 
        "SubState": "running", 
        "SyslogLevelPrefix": "yes", 
        "SyslogPriority": "30", 
        "SystemCallErrorNumber": "0", 
        "TTYReset": "no", 
        "TTYVHangup": "no", 
        "TTYVTDisallocate": "no", 
        "TasksAccounting": "no", 
        "TasksCurrent": "18446744073709551615", 
        "TasksMax": "18446744073709551615", 
        "TimeoutStartUSec": "1min 30s", 
        "TimeoutStopUSec": "1min 30s", 
        "TimerSlackNSec": "50000", 
        "Transient": "no", 
        "Type": "forking", 
        "UMask": "0022", 
        "UnitFilePreset": "disabled", 
        "UnitFileState": "enabled", 
        "WantedBy": "multi-user.target", 
        "Wants": "avahi-daemon.service", 
        "WatchdogTimestamp": "Sat 2024-08-17 06:23:37 EDT", 
        "WatchdogTimestampMonotonic": "318929428", 
        "WatchdogUSec": "0"
    }
}
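
The restart variant runs here (rather than the skipped "running and enabled" task above) because __pcp_restart_pmcd was set to true earlier. A sketch of an equivalent task:

    - name: Ensure performance metric collector is restarted and enabled on boot (sketch)
      service:
        name: pmcd
        state: restarted
        enabled: yes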

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Include pmie] ****
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:45
Saturday 17 August 2024  06:23:50 -0400 (0:00:01.637)       0:00:55.692 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml for managed_node1

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra performance rule group directories exist] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:4
Saturday 17 August 2024  06:23:51 -0400 (0:00:00.111)       0:00:55.804 ******* 
changed: [managed_node1] => (item=network) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "item": "network", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/pcp/pmieconf/network", 
    "secontext": "unconfined_u:object_r:etc_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}
changed: [managed_node1] => (item=power) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "item": "power", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/pcp/pmieconf/power", 
    "secontext": "unconfined_u:object_r:etc_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}
changed: [managed_node1] => (item=zeroconf) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "item": "zeroconf", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/pcp/pmieconf/zeroconf", 
    "secontext": "unconfined_u:object_r:etc_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}
changed: [managed_node1] => (item=filesys) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "item": "filesys", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/pcp/pmieconf/filesys", 
    "secontext": "unconfined_u:object_r:etc_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra performance rule group link directories exist] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:14
Saturday 17 August 2024  06:23:52 -0400 (0:00:01.449)       0:00:57.254 ******* 
changed: [managed_node1] => (item=network) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "item": "network", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/var/lib/pcp/config/pmieconf/network", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}
changed: [managed_node1] => (item=power) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "gid": 0, 
    "group": "root", 
    "item": "power", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/var/lib/pcp/config/pmieconf/power", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}
ok: [managed_node1] => (item=zeroconf) => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "item": "zeroconf", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/var/lib/pcp/config/pmieconf/zeroconf", 
    "secontext": "system_u:object_r:pcp_var_lib_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}
ok: [managed_node1] => (item=filesys) => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "item": "filesys", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/var/lib/pcp/config/pmieconf/filesys", 
    "secontext": "system_u:object_r:pcp_var_lib_t:s0", 
    "size": 4096, 
    "state": "directory", 
    "uid": 0
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra performance rules are installed for targeted hosts] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:24
Saturday 17 August 2024  06:23:53 -0400 (0:00:01.467)       0:00:58.721 ******* 
changed: [managed_node1] => (item=network/tcplistenoverflows) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "checksum": "608d8a6ac6ee33bb86b77d28ba24fbcd378db43d", 
    "dest": "/etc/pcp/pmieconf/network/tcplistenoverflows", 
    "gid": 0, 
    "group": "root", 
    "item": "network/tcplistenoverflows", 
    "md5sum": "315bdffd61351824525f8a1572d604ba", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 971, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890234.02-11570-5541815922818/source", 
    "state": "file", 
    "uid": 0
}
changed: [managed_node1] => (item=network/tcpqfulldocookies) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "checksum": "3256a5c2e8d07a20d8e97a08c0ab163252b0beae", 
    "dest": "/etc/pcp/pmieconf/network/tcpqfulldocookies", 
    "gid": 0, 
    "group": "root", 
    "item": "network/tcpqfulldocookies", 
    "md5sum": "b062e23db526f63dbb09bc648b03f100", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 1131, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890234.58-11570-57768041259386/source", 
    "state": "file", 
    "uid": 0
}
changed: [managed_node1] => (item=network/tcpqfulldrops) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "checksum": "37b2bd7f2430bd9678ab078c5e69a53bea556524", 
    "dest": "/etc/pcp/pmieconf/network/tcpqfulldrops", 
    "gid": 0, 
    "group": "root", 
    "item": "network/tcpqfulldrops", 
    "md5sum": "61b36977331d7cc7b98658f3e083e578", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 1129, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890235.18-11570-178093381611194/source", 
    "state": "file", 
    "uid": 0
}
changed: [managed_node1] => (item=power/thermal_throttle) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "checksum": "1d53d6182709617c8f633339652d8d9e75f3b603", 
    "dest": "/etc/pcp/pmieconf/power/thermal_throttle", 
    "gid": 0, 
    "group": "root", 
    "item": "power/thermal_throttle", 
    "md5sum": "87fa94811e21328fa870b42e4a8ab568", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 1153, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890235.81-11570-149979697360258/source", 
    "state": "file", 
    "uid": 0
}
changed: [managed_node1] => (item=zeroconf/all_threads) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "checksum": "65169db16dcaa224c211373001adc3addf1031c4", 
    "dest": "/etc/pcp/pmieconf/zeroconf/all_threads", 
    "gid": 0, 
    "group": "root", 
    "item": "zeroconf/all_threads", 
    "md5sum": "6bbe849190f88f2079c32b24f0fcf092", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 840, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890236.48-11570-273144527338059/source", 
    "state": "file", 
    "uid": 0
}
changed: [managed_node1] => (item=filesys/vfs_files) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "checksum": "cd5d85dfb8eebd7d9737d56e78bd969dafa3999c", 
    "dest": "/etc/pcp/pmieconf/filesys/vfs_files", 
    "gid": 0, 
    "group": "root", 
    "item": "filesys/vfs_files", 
    "md5sum": "ceb10cd4cd3fbbedd52b7d7e45963eb3", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 969, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890237.16-11570-276357151833513/source", 
    "state": "file", 
    "uid": 0
}
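
The six rule files above are installed into /etc/pcp/pmieconf with a single looped copy; a sketch of an equivalent task (the source layout inside the role is an assumption):

    - name: Ensure extra performance rules are installed for targeted hosts (sketch)
      copy:
        src: "{{ item }}"                     # assumed to ship with the role
        dest: "/etc/pcp/pmieconf/{{ item }}"
        mode: "0644"
      loop:
        - network/tcplistenoverflows
        - network/tcpqfulldocookies
        - network/tcpqfulldrops
        - power/thermal_throttle
        - zeroconf/all_threads
        - filesys/vfs_files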

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance rule actions are installed for targeted hosts] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:34
Saturday 17 August 2024  06:23:57 -0400 (0:00:03.787)       0:01:02.508 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "local_pmie": "default"
    }, 
    "changed": false
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Check if global pmie webhook action is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:38
Saturday 17 August 2024  06:23:57 -0400 (0:00:00.029)       0:01:02.538 ******* 
skipping: [managed_node1] => (item=default)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "default", 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Configure global webhook action] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:50
Saturday 17 August 2024  06:23:57 -0400 (0:00:00.039)       0:01:02.577 ******* 
skipping: [managed_node1] => (item={u'skip_reason': u'Conditional result was False', u'item': u'default', u'skipped': True, u'ansible_loop_var': u'item', u'changed': False})  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": {
        "ansible_loop_var": "item", 
        "changed": false, 
        "item": "default", 
        "skip_reason": "Conditional result was False", 
        "skipped": true
    }, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Check if global webhook endpoint is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:63
Saturday 17 August 2024  06:23:57 -0400 (0:00:00.051)       0:01:02.629 ******* 
ok: [managed_node1] => (item=default) => {
    "ansible_loop_var": "item", 
    "backup": "", 
    "changed": false, 
    "found": 0, 
    "item": "default"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Configure global webhook endpoint] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:73
Saturday 17 August 2024  06:23:58 -0400 (0:00:00.538)       0:01:03.167 ******* 
skipping: [managed_node1] => (item={u'ansible_loop_var': u'item', u'item': u'default', u'changed': False, u'failed': False, u'found': 0, u'invocation': {u'module_args': {u'directory_mode': None, u'force': None, u'remote_src': None, u'selevel': None, u'backrefs': False, u'insertafter': None, u'follow': False, u'owner': None, u'path': u'/var/lib/pcp/config/pmie/config.default', u'line': None, u'validate': None, u'src': None, u'group': None, u'insertbefore': None, u'unsafe_writes': False, u'delimiter': None, u'create': False, u'seuser': None, u'serole': None, u'regexp': u'//.*global webhook_endpoint = ""', u'content': None, u'state': u'absent', u'mode': None, u'firstmatch': False, u'attributes': None, u'backup': False, u'setype': None}}, u'diff': [{u'after': u'', u'after_header': u'/var/lib/pcp/config/pmie/config.default (content)', u'before_header': u'/var/lib/pcp/config/pmie/config.default (content)', u'before': u''}, {u'after_header': u'/var/lib/pcp/config/pmie/config.default (file attributes)', u'before_header': u'/var/lib/pcp/config/pmie/config.default (file attributes)'}], u'backup': u'', u'msg': u''})  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": {
        "ansible_loop_var": "item", 
        "backup": "", 
        "changed": false, 
        "diff": [
            {
                "after": "", 
                "after_header": "/var/lib/pcp/config/pmie/config.default (content)", 
                "before": "", 
                "before_header": "/var/lib/pcp/config/pmie/config.default (content)"
            }, 
            {
                "after_header": "/var/lib/pcp/config/pmie/config.default (file attributes)", 
                "before_header": "/var/lib/pcp/config/pmie/config.default (file attributes)"
            }
        ], 
        "failed": false, 
        "found": 0, 
        "invocation": {
            "module_args": {
                "attributes": null, 
                "backrefs": false, 
                "backup": false, 
                "content": null, 
                "create": false, 
                "delimiter": null, 
                "directory_mode": null, 
                "firstmatch": false, 
                "follow": false, 
                "force": null, 
                "group": null, 
                "insertafter": null, 
                "insertbefore": null, 
                "line": null, 
                "mode": null, 
                "owner": null, 
                "path": "/var/lib/pcp/config/pmie/config.default", 
                "regexp": "//.*global webhook_endpoint = \"\"", 
                "remote_src": null, 
                "selevel": null, 
                "serole": null, 
                "setype": null, 
                "seuser": null, 
                "src": null, 
                "state": "absent", 
                "unsafe_writes": false, 
                "validate": null
            }
        }, 
        "item": "default", 
        "msg": ""
    }, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra rules symlinks have been created for targeted hosts] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:86
Saturday 17 August 2024  06:23:58 -0400 (0:00:00.056)       0:01:03.223 ******* 
changed: [managed_node1] => (item=network/tcplistenoverflows) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/config/pmieconf/network/tcplistenoverflows", 
    "gid": 0, 
    "group": "root", 
    "item": "network/tcplistenoverflows", 
    "mode": "0777", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 44, 
    "src": "/etc/pcp/pmieconf/network/tcplistenoverflows", 
    "state": "link", 
    "uid": 0
}
changed: [managed_node1] => (item=network/tcpqfulldocookies) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/config/pmieconf/network/tcpqfulldocookies", 
    "gid": 0, 
    "group": "root", 
    "item": "network/tcpqfulldocookies", 
    "mode": "0777", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 43, 
    "src": "/etc/pcp/pmieconf/network/tcpqfulldocookies", 
    "state": "link", 
    "uid": 0
}
changed: [managed_node1] => (item=network/tcpqfulldrops) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/config/pmieconf/network/tcpqfulldrops", 
    "gid": 0, 
    "group": "root", 
    "item": "network/tcpqfulldrops", 
    "mode": "0777", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 39, 
    "src": "/etc/pcp/pmieconf/network/tcpqfulldrops", 
    "state": "link", 
    "uid": 0
}
changed: [managed_node1] => (item=power/thermal_throttle) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/config/pmieconf/power/thermal_throttle", 
    "gid": 0, 
    "group": "root", 
    "item": "power/thermal_throttle", 
    "mode": "0777", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 40, 
    "src": "/etc/pcp/pmieconf/power/thermal_throttle", 
    "state": "link", 
    "uid": 0
}
changed: [managed_node1] => (item=zeroconf/all_threads) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/config/pmieconf/zeroconf/all_threads", 
    "gid": 0, 
    "group": "root", 
    "item": "zeroconf/all_threads", 
    "mode": "0777", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 38, 
    "src": "/etc/pcp/pmieconf/zeroconf/all_threads", 
    "state": "link", 
    "uid": 0
}
changed: [managed_node1] => (item=filesys/vfs_files) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "dest": "/var/lib/pcp/config/pmieconf/filesys/vfs_files", 
    "gid": 0, 
    "group": "root", 
    "item": "filesys/vfs_files", 
    "mode": "0777", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:pcp_var_lib_t:s0", 
    "size": 35, 
    "src": "/etc/pcp/pmieconf/filesys/vfs_files", 
    "state": "link", 
    "uid": 0
}
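
The symlinks mirror each installed rule from /etc/pcp/pmieconf into /var/lib/pcp/config/pmieconf, the rules tree that pmieconf consumes. A sketch using the same loop as the copy task above:

    - name: Ensure extra rules symlinks have been created for targeted hosts (sketch)
      file:
        src: "/etc/pcp/pmieconf/{{ item }}"
        dest: "/var/lib/pcp/config/pmieconf/{{ item }}"
        state: link
      loop:
        - network/tcplistenoverflows
        - network/tcpqfulldocookies
        - network/tcpqfulldrops
        - power/thermal_throttle
        - zeroconf/all_threads
        - filesys/vfs_files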

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Enable performance metric inference for targeted hosts (with control.d)] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:95
Saturday 17 August 2024  06:24:00 -0400 (0:00:01.899)       0:01:05.123 ******* 

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Enable performance metric inference for targeted hosts (single control)] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:106
Saturday 17 August 2024  06:24:00 -0400 (0:00:00.037)       0:01:05.160 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Set variable to do pmie restart if needed] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:116
Saturday 17 August 2024  06:24:00 -0400 (0:00:00.031)       0:01:05.192 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__pcp_restart_pmie": true
    }, 
    "changed": false
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric inference is running and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:120
Saturday 17 August 2024  06:24:00 -0400 (0:00:00.040)       0:01:05.232 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric inference is restarted and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:127
Saturday 17 August 2024  06:24:00 -0400 (0:00:00.030)       0:01:05.263 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "enabled": true, 
    "name": "pmie", 
    "state": "started", 
    "status": {
        "ActiveEnterTimestamp": "Sat 2024-08-17 06:23:39 EDT", 
        "ActiveEnterTimestampMonotonic": "320221404", 
        "ActiveExitTimestampMonotonic": "0", 
        "ActiveState": "active", 
        "After": "system.slice network-online.target pmcd.service pmie_check.timer basic.target systemd-journald.socket pmie_daily.timer", 
        "AllowIsolate": "no", 
        "AmbientCapabilities": "0", 
        "AssertResult": "yes", 
        "AssertTimestamp": "Sat 2024-08-17 06:23:38 EDT", 
        "AssertTimestampMonotonic": "319838224", 
        "Before": "shutdown.target multi-user.target", 
        "BindsTo": "pmie_check.timer pmie_daily.timer", 
        "BlockIOAccounting": "no", 
        "BlockIOWeight": "18446744073709551615", 
        "CPUAccounting": "no", 
        "CPUQuotaPerSecUSec": "infinity", 
        "CPUSchedulingPolicy": "0", 
        "CPUSchedulingPriority": "0", 
        "CPUSchedulingResetOnFork": "no", 
        "CPUShares": "18446744073709551615", 
        "CanIsolate": "no", 
        "CanReload": "no", 
        "CanStart": "yes", 
        "CanStop": "yes", 
        "CapabilityBoundingSet": "18446744073709551615", 
        "CollectMode": "inactive", 
        "ConditionResult": "yes", 
        "ConditionTimestamp": "Sat 2024-08-17 06:23:38 EDT", 
        "ConditionTimestampMonotonic": "319838223", 
        "Conflicts": "shutdown.target", 
        "ControlGroup": "/system.slice/pmie.service", 
        "ControlPID": "0", 
        "DefaultDependencies": "yes", 
        "Delegate": "no", 
        "Description": "Performance Metrics Inference Engine", 
        "DevicePolicy": "auto", 
        "Documentation": "man:pmie(1)", 
        "ExecMainCode": "0", 
        "ExecMainExitTimestampMonotonic": "0", 
        "ExecMainPID": "17666", 
        "ExecMainStartTimestamp": "Sat 2024-08-17 06:23:39 EDT", 
        "ExecMainStartTimestampMonotonic": "320221322", 
        "ExecMainStatus": "0", 
        "ExecStart": "{ path=/usr/share/pcp/lib/pmie ; argv[]=/usr/share/pcp/lib/pmie start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "ExecStop": "{ path=/usr/share/pcp/lib/pmie ; argv[]=/usr/share/pcp/lib/pmie stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "FailureAction": "none", 
        "FileDescriptorStoreMax": "0", 
        "FragmentPath": "/usr/lib/systemd/system/pmie.service", 
        "GuessMainPID": "yes", 
        "IOScheduling": "0", 
        "Id": "pmie.service", 
        "IgnoreOnIsolate": "no", 
        "IgnoreOnSnapshot": "no", 
        "IgnoreSIGPIPE": "yes", 
        "InactiveEnterTimestampMonotonic": "0", 
        "InactiveExitTimestamp": "Sat 2024-08-17 06:23:38 EDT", 
        "InactiveExitTimestampMonotonic": "319838891", 
        "JobTimeoutAction": "none", 
        "JobTimeoutUSec": "0", 
        "KillMode": "control-group", 
        "KillSignal": "15", 
        "LimitAS": "18446744073709551615", 
        "LimitCORE": "18446744073709551615", 
        "LimitCPU": "18446744073709551615", 
        "LimitDATA": "18446744073709551615", 
        "LimitFSIZE": "18446744073709551615", 
        "LimitLOCKS": "18446744073709551615", 
        "LimitMEMLOCK": "65536", 
        "LimitMSGQUEUE": "819200", 
        "LimitNICE": "0", 
        "LimitNOFILE": "4096", 
        "LimitNPROC": "14311", 
        "LimitRSS": "18446744073709551615", 
        "LimitRTPRIO": "0", 
        "LimitRTTIME": "18446744073709551615", 
        "LimitSIGPENDING": "14311", 
        "LimitSTACK": "18446744073709551615", 
        "LoadState": "loaded", 
        "MainPID": "17666", 
        "MemoryAccounting": "no", 
        "MemoryCurrent": "18446744073709551615", 
        "MemoryLimit": "18446744073709551615", 
        "MountFlags": "0", 
        "Names": "pmie.service", 
        "NeedDaemonReload": "no", 
        "Nice": "0", 
        "NoNewPrivileges": "no", 
        "NonBlocking": "no", 
        "NotifyAccess": "none", 
        "OOMScoreAdjust": "0", 
        "OnFailureJobMode": "replace", 
        "PIDFile": "/run/pcp/pmie.pid", 
        "PermissionsStartOnly": "no", 
        "PrivateDevices": "no", 
        "PrivateNetwork": "no", 
        "PrivateTmp": "no", 
        "ProtectHome": "no", 
        "ProtectSystem": "no", 
        "RefuseManualStart": "no", 
        "RefuseManualStop": "no", 
        "RemainAfterExit": "no", 
        "Requires": "basic.target system.slice", 
        "Restart": "always", 
        "RestartUSec": "100ms", 
        "Result": "success", 
        "RootDirectoryStartOnly": "no", 
        "RuntimeDirectoryMode": "0755", 
        "SameProcessGroup": "no", 
        "SecureBits": "0", 
        "SendSIGHUP": "no", 
        "SendSIGKILL": "yes", 
        "Slice": "system.slice", 
        "StandardError": "inherit", 
        "StandardInput": "null", 
        "StandardOutput": "journal", 
        "StartLimitAction": "none", 
        "StartLimitBurst": "5", 
        "StartLimitInterval": "10000000", 
        "StartupBlockIOWeight": "18446744073709551615", 
        "StartupCPUShares": "18446744073709551615", 
        "StatusErrno": "0", 
        "StopWhenUnneeded": "no", 
        "SubState": "running", 
        "SyslogLevelPrefix": "yes", 
        "SyslogPriority": "30", 
        "SystemCallErrorNumber": "0", 
        "TTYReset": "no", 
        "TTYVHangup": "no", 
        "TTYVTDisallocate": "no", 
        "TasksAccounting": "no", 
        "TasksCurrent": "18446744073709551615", 
        "TasksMax": "18446744073709551615", 
        "TimeoutStartUSec": "1min 30s", 
        "TimeoutStopUSec": "1min 30s", 
        "TimerSlackNSec": "50000", 
        "Transient": "no", 
        "Type": "forking", 
        "UMask": "0022", 
        "UnitFilePreset": "disabled", 
        "UnitFileState": "enabled", 
        "WantedBy": "multi-user.target", 
        "WatchdogTimestampMonotonic": "0", 
        "WatchdogUSec": "0"
    }
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Include pmlogger] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:48
Saturday 17 August 2024  06:24:01 -0400 (0:00:00.911)       0:01:06.174 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml for managed_node1

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure metric log location is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:4
Saturday 17 August 2024  06:24:01 -0400 (0:00:00.117)       0:01:06.292 ******* 
NOTIFIED HANDLER fedora.linux_system_roles.private_metrics_subrole_pcp : Restart pmproxy for managed_node1
changed: [managed_node1] => {
    "backup": "", 
    "changed": true
}

MSG:

line added
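
The "line added" message is the characteristic lineinfile result, and the NOTIFIED HANDLER line above shows the change queues a pmproxy restart. A minimal sketch of a task shape that would produce this output (file path, pattern, and value are illustrative assumptions, not the role's actual code):

    # sketch only -- the real task lives in tasks/pmlogger.yml of the subrole
    - name: Ensure metric log location is configured
      lineinfile:
        path: /etc/pcp.conf                              # assumed target file
        regexp: '^PCP_ARCHIVE_DIR='
        line: 'PCP_ARCHIVE_DIR=/var/log/pcp/pmlogger'    # assumed value
      notify: Restart pmproxy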

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric logging is configured] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:12
Saturday 17 August 2024  06:24:01 -0400 (0:00:00.357)       0:01:06.650 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "67bc35973101c614e92b1990f8bebfffc39fe498", 
    "dest": "/etc/sysconfig/pmlogger", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "be3c2b929f8c3f822c4ee10fb7b5053d", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 1180, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890241.92-12053-103893669919133/source", 
    "state": "file", 
    "uid": 0
}
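
The checksum/dest/src fields above are the signature of the template module writing /etc/sysconfig/pmlogger. A hedged sketch of such a task; the template file name is an assumption:

    - name: Ensure performance metric logging is configured
      template:
        src: pmlogger.defaults.j2    # assumed template name
        dest: /etc/sysconfig/pmlogger
        owner: root
        group: root
        mode: "0644"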

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric logging retention period is set] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:19
Saturday 17 August 2024  06:24:02 -0400 (0:00:00.618)       0:01:07.269 ******* 
NOTIFIED HANDLER fedora.linux_system_roles.private_metrics_subrole_pcp : Restart pmlogger for managed_node1
changed: [managed_node1] => {
    "changed": true, 
    "checksum": "df7bd3b5b6f1de3af164aab81441c7251a13a298", 
    "dest": "/etc/sysconfig/pmlogger_timers", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "97c2cdbde792aba7ff0c922c6e06e10b", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 988, 
    "src": "/root/.ansible/tmp/ansible-tmp-1723890242.53-12076-213554268542040/source", 
    "state": "file", 
    "uid": 0
}
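
The retention-period file follows the same pattern, and the NOTIFIED HANDLER line above indicates this task also notifies a pmlogger restart. Sketch only; the template name is an assumption:

    - name: Ensure performance metric logging retention period is set
      template:
        src: pmlogger.timers.j2      # assumed template name
        dest: /etc/sysconfig/pmlogger_timers
        owner: root
        group: root
        mode: "0644"
      notify: Restart pmlogger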

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Enable performance metric logging for targeted hosts (with control.d)] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:27
Saturday 17 August 2024  06:24:03 -0400 (0:00:00.571)       0:01:07.841 ******* 

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Enable performance metric logging for targeted hosts (single control)] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:39
Saturday 17 August 2024  06:24:03 -0400 (0:00:00.028)       0:01:07.869 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Set variable to do pmlogger restart if needed] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:49
Saturday 17 August 2024  06:24:03 -0400 (0:00:00.030)       0:01:07.899 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "__pcp_restart_pmlogger": true
    }, 
    "changed": false
}
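
The fact set here is what later selects a restart rather than a plain start of pmlogger. A minimal sketch, assuming the flag is derived from the changed status of the configuration tasks above (the register name is hypothetical):

    - name: Set variable to do pmlogger restart if needed
      set_fact:
        __pcp_restart_pmlogger: true
      when: __pmlogger_defaults is changed    # hypothetical register from the template task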

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric logging is running and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:53
Saturday 17 August 2024  06:24:03 -0400 (0:00:00.034)       0:01:07.934 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric logging is restarted and enabled on boot] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:60
Saturday 17 August 2024  06:24:03 -0400 (0:00:00.030)       0:01:07.964 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "enabled": true, 
    "name": "pmlogger", 
    "state": "started", 
    "status": {
        "ActiveEnterTimestamp": "Sat 2024-08-17 06:23:38 EDT", 
        "ActiveEnterTimestampMonotonic": "319826026", 
        "ActiveExitTimestampMonotonic": "0", 
        "ActiveState": "active", 
        "After": "pmlogger_daily-poll.timer pmlogger_check.timer systemd-journald.socket system.slice basic.target network-online.target pmcd.service pmlogger_daily.timer", 
        "AllowIsolate": "no", 
        "AmbientCapabilities": "0", 
        "AssertResult": "yes", 
        "AssertTimestamp": "Sat 2024-08-17 06:23:31 EDT", 
        "AssertTimestampMonotonic": "313086337", 
        "Before": "shutdown.target multi-user.target", 
        "BindsTo": "pmlogger_daily.timer pmlogger_daily-poll.timer pmlogger_check.timer", 
        "BlockIOAccounting": "no", 
        "BlockIOWeight": "18446744073709551615", 
        "CPUAccounting": "no", 
        "CPUQuotaPerSecUSec": "infinity", 
        "CPUSchedulingPolicy": "0", 
        "CPUSchedulingPriority": "0", 
        "CPUSchedulingResetOnFork": "no", 
        "CPUShares": "18446744073709551615", 
        "CanIsolate": "no", 
        "CanReload": "no", 
        "CanStart": "yes", 
        "CanStop": "yes", 
        "CapabilityBoundingSet": "18446744073709551615", 
        "CollectMode": "inactive", 
        "ConditionResult": "yes", 
        "ConditionTimestamp": "Sat 2024-08-17 06:23:31 EDT", 
        "ConditionTimestampMonotonic": "313086336", 
        "Conflicts": "shutdown.target", 
        "ControlGroup": "/system.slice/pmlogger.service", 
        "ControlPID": "0", 
        "DefaultDependencies": "yes", 
        "Delegate": "no", 
        "Description": "Performance Metrics Archive Logger", 
        "DevicePolicy": "auto", 
        "Documentation": "man:pmlogger(1)", 
        "ExecMainCode": "0", 
        "ExecMainExitTimestampMonotonic": "0", 
        "ExecMainPID": "17326", 
        "ExecMainStartTimestamp": "Sat 2024-08-17 06:23:38 EDT", 
        "ExecMainStartTimestampMonotonic": "319825902", 
        "ExecMainStatus": "0", 
        "ExecStart": "{ path=/usr/share/pcp/lib/pmlogger ; argv[]=/usr/share/pcp/lib/pmlogger start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "ExecStop": "{ path=/usr/share/pcp/lib/pmlogger ; argv[]=/usr/share/pcp/lib/pmlogger stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "FailureAction": "none", 
        "FileDescriptorStoreMax": "0", 
        "FragmentPath": "/usr/lib/systemd/system/pmlogger.service", 
        "GuessMainPID": "yes", 
        "IOScheduling": "0", 
        "Id": "pmlogger.service", 
        "IgnoreOnIsolate": "no", 
        "IgnoreOnSnapshot": "no", 
        "IgnoreSIGPIPE": "yes", 
        "InactiveEnterTimestampMonotonic": "0", 
        "InactiveExitTimestamp": "Sat 2024-08-17 06:23:31 EDT", 
        "InactiveExitTimestampMonotonic": "313086941", 
        "JobTimeoutAction": "none", 
        "JobTimeoutUSec": "0", 
        "KillMode": "control-group", 
        "KillSignal": "15", 
        "LimitAS": "18446744073709551615", 
        "LimitCORE": "18446744073709551615", 
        "LimitCPU": "18446744073709551615", 
        "LimitDATA": "18446744073709551615", 
        "LimitFSIZE": "18446744073709551615", 
        "LimitLOCKS": "18446744073709551615", 
        "LimitMEMLOCK": "65536", 
        "LimitMSGQUEUE": "819200", 
        "LimitNICE": "0", 
        "LimitNOFILE": "4096", 
        "LimitNPROC": "14311", 
        "LimitRSS": "18446744073709551615", 
        "LimitRTPRIO": "0", 
        "LimitRTTIME": "18446744073709551615", 
        "LimitSIGPENDING": "14311", 
        "LimitSTACK": "18446744073709551615", 
        "LoadState": "loaded", 
        "MainPID": "17326", 
        "MemoryAccounting": "no", 
        "MemoryCurrent": "18446744073709551615", 
        "MemoryLimit": "18446744073709551615", 
        "MountFlags": "0", 
        "Names": "pmlogger.service", 
        "NeedDaemonReload": "no", 
        "Nice": "0", 
        "NoNewPrivileges": "no", 
        "NonBlocking": "no", 
        "NotifyAccess": "none", 
        "OOMScoreAdjust": "0", 
        "OnFailureJobMode": "replace", 
        "PIDFile": "/run/pcp/pmlogger.pid", 
        "PermissionsStartOnly": "no", 
        "PrivateDevices": "no", 
        "PrivateNetwork": "no", 
        "PrivateTmp": "no", 
        "ProtectHome": "no", 
        "ProtectSystem": "no", 
        "RefuseManualStart": "no", 
        "RefuseManualStop": "no", 
        "RemainAfterExit": "no", 
        "Requires": "basic.target system.slice", 
        "Restart": "always", 
        "RestartUSec": "100ms", 
        "Result": "success", 
        "RootDirectoryStartOnly": "no", 
        "RuntimeDirectoryMode": "0755", 
        "SameProcessGroup": "no", 
        "SecureBits": "0", 
        "SendSIGHUP": "no", 
        "SendSIGKILL": "yes", 
        "Slice": "system.slice", 
        "StandardError": "inherit", 
        "StandardInput": "null", 
        "StandardOutput": "journal", 
        "StartLimitAction": "none", 
        "StartLimitBurst": "5", 
        "StartLimitInterval": "10000000", 
        "StartupBlockIOWeight": "18446744073709551615", 
        "StartupCPUShares": "18446744073709551615", 
        "StatusErrno": "0", 
        "StopWhenUnneeded": "no", 
        "SubState": "running", 
        "SyslogLevelPrefix": "yes", 
        "SyslogPriority": "30", 
        "SystemCallErrorNumber": "0", 
        "TTYReset": "no", 
        "TTYVHangup": "no", 
        "TTYVTDisallocate": "no", 
        "TasksAccounting": "no", 
        "TasksCurrent": "18446744073709551615", 
        "TasksMax": "18446744073709551615", 
        "TimeoutStartUSec": "1min 30s", 
        "TimeoutStopUSec": "1min 30s", 
        "TimerSlackNSec": "50000", 
        "Transient": "no", 
        "Type": "forking", 
        "UMask": "0022", 
        "UnitFilePreset": "disabled", 
        "UnitFileState": "enabled", 
        "WantedBy": "multi-user.target", 
        "WatchdogTimestampMonotonic": "0", 
        "WatchdogUSec": "0"
    }
}
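
This result comes from the service-management step: pmlogger is enabled for boot and (re)started, which is why the plain "running and enabled" variant just before it was skipped. A sketch of the shape of such a task, assuming the systemd module and the fact set earlier:

    - name: Ensure performance metric logging is restarted and enabled on boot
      systemd:
        name: pmlogger
        state: restarted
        enabled: yes
      when: __pcp_restart_pmlogger | d(false) | bool    # assumed conditional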

TASK [fedora.linux_system_roles.private_metrics_subrole_pcp : Include pmproxy] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:51
Saturday 17 August 2024  06:24:09 -0400 (0:00:06.201)       0:01:14.166 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Setup metric graphing service.] ******************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:96
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.030)       0:01:14.197 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Configure firewall] ******************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:104
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.028)       0:01:14.225 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml for managed_node1

TASK [fedora.linux_system_roles.metrics : Initialize __metrics_firewall] *******
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml:9
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.071)       0:01:14.297 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Port for pmcd] ***********************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml:13
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.037)       0:01:14.334 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Port for pmproxy used by query and grafana] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml:19
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.036)       0:01:14.371 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Service for grafana] *****************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml:25
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.045)       0:01:14.417 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Service for redis] *******************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml:31
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.038)       0:01:14.455 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Ensure the service and the port status with the firewall role] ***********
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/firewall.yml:37
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.035)       0:01:14.490 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.metrics : Configure selinux] *******************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/main.yml:107
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.039)       0:01:14.530 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/selinux.yml for managed_node1

TASK [fedora.linux_system_roles.metrics : Set pcp_bind_all_unreserved_ports] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/selinux.yml:6
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.075)       0:01:14.606 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Ensure the port status with the selinux role] ****************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/metrics/tasks/selinux.yml:11
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.029)       0:01:14.635 ******* 
skipping: [managed_node1] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}

TASK [Check if import from Elasticsearch works] ********************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/tests_verify_from_elasticsearch.yml:30
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.028)       0:01:14.664 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/check_from_elasticsearch.yml for managed_node1

TASK [Check if elasticsearch pmda is registered] *******************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/check_from_elasticsearch.yml:3
Saturday 17 August 2024  06:24:09 -0400 (0:00:00.046)       0:01:14.711 ******* 
ok: [managed_node1] => {
    "attempts": 1, 
    "changed": false, 
    "cmd": [
        "pmprobe", 
        "-I", 
        "pmcd.agent.status"
    ], 
    "delta": "0:00:00.007424", 
    "end": "2024-08-17 06:24:10.158448", 
    "rc": 0, 
    "start": "2024-08-17 06:24:10.151024"
}

STDOUT:

pmcd.agent.status 12 "root" "pmcd" "proc" "pmproxy" "xfs" "linux" "nfsclient" "mmv" "kvm" "elasticsearch" "jbd2" "dm"
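
The check passes because "elasticsearch" appears in the pmcd.agent.status string list, and the "attempts": 1 field indicates a retry loop that succeeded on the first try. A hedged sketch of such a check; the retry counts and register name are assumptions:

    - name: Check if elasticsearch pmda is registered
      command: pmprobe -I pmcd.agent.status
      register: __pmprobe_out             # hypothetical register name
      until: "'elasticsearch' in __pmprobe_out.stdout"
      retries: 10
      delay: 5
      changed_when: false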

TASK [Set platform/version specific variables] *********************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/check_from_elasticsearch.yml:12
Saturday 17 August 2024  06:24:10 -0400 (0:00:00.297)       0:01:15.009 ******* 
skipping: [managed_node1] => (item=roles/linux-system-roles.metrics/roles/elasticsearch/vars/default.yml")  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "roles/linux-system-roles.metrics/roles/elasticsearch/vars/default.yml\"", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=roles/linux-system-roles.metrics/roles/elasticsearch/vars/RedHat.yml")  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "roles/linux-system-roles.metrics/roles/elasticsearch/vars/RedHat.yml\"", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=roles/linux-system-roles.metrics/roles/elasticsearch/vars/CentOS.yml")  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "roles/linux-system-roles.metrics/roles/elasticsearch/vars/CentOS.yml\"", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=roles/linux-system-roles.metrics/roles/elasticsearch/vars/CentOS_7.yml")  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "roles/linux-system-roles.metrics/roles/elasticsearch/vars/CentOS_7.yml\"", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=roles/linux-system-roles.metrics/roles/elasticsearch/vars/CentOS_7.9.yml")  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "roles/linux-system-roles.metrics/roles/elasticsearch/vars/CentOS_7.9.yml\"", 
    "skip_reason": "Conditional result was False"
}

TASK [Check the ansible_managed header in the configuration file] **************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/check_from_elasticsearch.yml:23
Saturday 17 August 2024  06:24:10 -0400 (0:00:00.067)       0:01:15.077 ******* 
included: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/check_header.yml for managed_node1

TASK [Grep the ansible_managed header in /var/lib/pcp/pmdas/elasticsearch/elasticsearch.conf] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/check_header.yml:3
Saturday 17 August 2024  06:24:10 -0400 (0:00:00.043)       0:01:15.120 ******* 
ok: [managed_node1] => {
    "changed": false, 
    "cmd": [
        "grep", 
        "^# Ansible managed", 
        "/var/lib/pcp/pmdas/elasticsearch/elasticsearch.conf"
    ], 
    "delta": "0:00:00.003103", 
    "end": "2024-08-17 06:24:10.571055", 
    "rc": 0, 
    "start": "2024-08-17 06:24:10.567952"
}

STDOUT:

# Ansible managed
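
rc 0 and the echoed banner confirm that the generated PMDA configuration carries the ansible_managed header. A sketch of a grep task matching the cmd list shown above:

    - name: Grep the ansible_managed header in the configuration file
      command: >-
        grep '^# Ansible managed'
        /var/lib/pcp/pmdas/elasticsearch/elasticsearch.conf
      changed_when: false    # consistent with the "changed": false reported above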

RUNNING HANDLER [fedora.linux_system_roles.private_metrics_subrole_pcp : Restart pmproxy] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/handlers/main.yml:14
Saturday 17 August 2024  06:24:10 -0400 (0:00:00.340)       0:01:15.460 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "name": "pmproxy", 
    "state": "started", 
    "status": {
        "ActiveEnterTimestampMonotonic": "0", 
        "ActiveExitTimestampMonotonic": "0", 
        "ActiveState": "inactive", 
        "After": "basic.target network-online.target system.slice pmcd.service systemd-journald.socket", 
        "AllowIsolate": "no", 
        "AmbientCapabilities": "0", 
        "AssertResult": "no", 
        "AssertTimestampMonotonic": "0", 
        "Before": "shutdown.target", 
        "BlockIOAccounting": "no", 
        "BlockIOWeight": "18446744073709551615", 
        "CPUAccounting": "no", 
        "CPUQuotaPerSecUSec": "infinity", 
        "CPUSchedulingPolicy": "0", 
        "CPUSchedulingPriority": "0", 
        "CPUSchedulingResetOnFork": "no", 
        "CPUShares": "18446744073709551615", 
        "CanIsolate": "no", 
        "CanReload": "no", 
        "CanStart": "yes", 
        "CanStop": "yes", 
        "CapabilityBoundingSet": "18446744073709551615", 
        "CollectMode": "inactive", 
        "ConditionResult": "no", 
        "ConditionTimestampMonotonic": "0", 
        "Conflicts": "shutdown.target", 
        "ControlPID": "0", 
        "DefaultDependencies": "yes", 
        "Delegate": "no", 
        "Description": "Proxy for Performance Metrics Collector Daemon", 
        "DevicePolicy": "auto", 
        "Documentation": "man:pmproxy(8)", 
        "ExecMainCode": "0", 
        "ExecMainExitTimestampMonotonic": "0", 
        "ExecMainPID": "0", 
        "ExecMainStartTimestampMonotonic": "0", 
        "ExecMainStatus": "0", 
        "ExecStart": "{ path=/usr/share/pcp/lib/pmproxy ; argv[]=/usr/share/pcp/lib/pmproxy start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "ExecStop": "{ path=/usr/share/pcp/lib/pmproxy ; argv[]=/usr/share/pcp/lib/pmproxy stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
        "FailureAction": "none", 
        "FileDescriptorStoreMax": "0", 
        "FragmentPath": "/usr/lib/systemd/system/pmproxy.service", 
        "GuessMainPID": "yes", 
        "IOScheduling": "0", 
        "Id": "pmproxy.service", 
        "IgnoreOnIsolate": "no", 
        "IgnoreOnSnapshot": "no", 
        "IgnoreSIGPIPE": "yes", 
        "InactiveEnterTimestampMonotonic": "0", 
        "InactiveExitTimestampMonotonic": "0", 
        "JobTimeoutAction": "none", 
        "JobTimeoutUSec": "0", 
        "KillMode": "control-group", 
        "KillSignal": "15", 
        "LimitAS": "18446744073709551615", 
        "LimitCORE": "18446744073709551615", 
        "LimitCPU": "18446744073709551615", 
        "LimitDATA": "18446744073709551615", 
        "LimitFSIZE": "18446744073709551615", 
        "LimitLOCKS": "18446744073709551615", 
        "LimitMEMLOCK": "65536", 
        "LimitMSGQUEUE": "819200", 
        "LimitNICE": "0", 
        "LimitNOFILE": "4096", 
        "LimitNPROC": "14311", 
        "LimitRSS": "18446744073709551615", 
        "LimitRTPRIO": "0", 
        "LimitRTTIME": "18446744073709551615", 
        "LimitSIGPENDING": "14311", 
        "LimitSTACK": "18446744073709551615", 
        "LoadState": "loaded", 
        "MainPID": "0", 
        "MemoryAccounting": "no", 
        "MemoryCurrent": "18446744073709551615", 
        "MemoryLimit": "18446744073709551615", 
        "MountFlags": "0", 
        "Names": "pmproxy.service", 
        "NeedDaemonReload": "no", 
        "Nice": "0", 
        "NoNewPrivileges": "no", 
        "NonBlocking": "no", 
        "NotifyAccess": "none", 
        "OOMScoreAdjust": "0", 
        "OnFailureJobMode": "replace", 
        "PIDFile": "/run/pcp/pmproxy.pid", 
        "PermissionsStartOnly": "no", 
        "PrivateDevices": "no", 
        "PrivateNetwork": "no", 
        "PrivateTmp": "no", 
        "ProtectHome": "no", 
        "ProtectSystem": "no", 
        "RefuseManualStart": "no", 
        "RefuseManualStop": "no", 
        "RemainAfterExit": "no", 
        "Requires": "basic.target system.slice", 
        "Restart": "always", 
        "RestartUSec": "100ms", 
        "Result": "success", 
        "RootDirectoryStartOnly": "no", 
        "RuntimeDirectoryMode": "0755", 
        "SameProcessGroup": "no", 
        "SecureBits": "0", 
        "SendSIGHUP": "no", 
        "SendSIGKILL": "yes", 
        "Slice": "system.slice", 
        "StandardError": "inherit", 
        "StandardInput": "null", 
        "StandardOutput": "journal", 
        "StartLimitAction": "none", 
        "StartLimitBurst": "5", 
        "StartLimitInterval": "10000000", 
        "StartupBlockIOWeight": "18446744073709551615", 
        "StartupCPUShares": "18446744073709551615", 
        "StatusErrno": "0", 
        "StopWhenUnneeded": "no", 
        "SubState": "dead", 
        "SyslogLevelPrefix": "yes", 
        "SyslogPriority": "30", 
        "SystemCallErrorNumber": "0", 
        "TTYReset": "no", 
        "TTYVHangup": "no", 
        "TTYVTDisallocate": "no", 
        "TasksAccounting": "no", 
        "TasksCurrent": "18446744073709551615", 
        "TasksMax": "18446744073709551615", 
        "TimeoutStartUSec": "1min 30s", 
        "TimeoutStopUSec": "1min 30s", 
        "TimerSlackNSec": "50000", 
        "Transient": "no", 
        "Type": "forking", 
        "UMask": "0022", 
        "UnitFilePreset": "disabled", 
        "UnitFileState": "disabled", 
        "WatchdogTimestampMonotonic": "0", 
        "WatchdogUSec": "0"
    }
}
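
The pmproxy handler fires because the log-location change earlier notified it; the "status" dump appears to capture the unit's prior state (inactive, MainPID 0) before the handler brings it up. A minimal sketch of the handler, assuming the service module (whether the role asks for started or restarted is not visible in this output):

    # handlers/main.yml (sketch)
    - name: Restart pmproxy
      service:
        name: pmproxy
        state: restarted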

RUNNING HANDLER [fedora.linux_system_roles.private_metrics_subrole_pcp : Restart pmlogger] ***
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/handlers/main.yml:19
Saturday 17 August 2024  06:24:11 -0400 (0:00:00.539)       0:01:16.000 ******* 
changed: [managed_node1] => {
    "changed": true, 
    "name": "pmlogger", 
    "state": "started", 
    "status": {
        "ActiveEnterTimestamp": "Sat 2024-08-17 06:24:09 EDT", 
        "ActiveEnterTimestampMonotonic": "350429666", 
        "ActiveExitTimestamp": "Sat 2024-08-17 06:24:03 EDT", 
        "ActiveExitTimestampMonotonic": "344637469", 
        "ActiveState": "active", 
        "After": "pmlogger_daily-poll.timer pmlogger_check.timer systemd-journald.socket system.slice basic.target network-online.target pmcd.service pmlogger_daily.timer", 
        "AllowIsolate": "no", 
        "AmbientCapabilities": "0", 
        "AssertResult": "yes", 
        "AssertTimestamp": "Sat 2024-08-17 06:24:03 EDT", 
        "AssertTimestampMonotonic": "344709433", 
        "Before": "shutdown.target multi-user.target", 
        "BindsTo": "pmlogger_daily.timer pmlogger_daily-poll.timer pmlogger_check.timer", 
        "BlockIOAccounting": "no", 
        "BlockIOWeight": "18446744073709551615", 
        "CPUAccounting": "no", 
        "CPUQuotaPerSecUSec": "infinity", 
        "CPUSchedulingPolicy": "0", 
        "CPUSchedulingPriority": "0", 
        "CPUSchedulingResetOnFork": "no", 
        "CPUShares": "18446744073709551615", 
        "CanIsolate": "no", 
        "CanReload": "no", 
        "CanStart": "yes", 
        "CanStop": "yes", 
        "CapabilityBoundingSet": "18446744073709551615", 
        "CollectMode": "inactive", 
        "ConditionResult": "yes", 
        "ConditionTimestamp": "Sat 2024-08-17 06:24:03 EDT", 
        "ConditionTimestampMonotonic": "344709432", 
        "Conflicts": "shutdown.target", 
        "ControlGroup": "/system.slice/pmlogger.service", 
        "ControlPID": "0", 
        "DefaultDependencies": "yes", 
        "Delegate": "no", 
        "Description": "Performance Metrics Archive Logger", 
        "DevicePolicy": "auto", 
        "Documentation": "man:pmlogger(1)", 
        "ExecMainCode": "0", 
        "ExecMainExitTimestampMonotonic": "0", 
        "ExecMainPID": "27402", 
        "ExecMainStartTimestamp": "Sat 2024-08-17 06:24:09 EDT", 
        "ExecMainStartTimestampMonotonic": "350429337", 
        "ExecMainStatus": "0", 
        "ExecStart": "{ path=/usr/share/pcp/lib/pmlogger ; argv[]=/usr/share/pcp/lib/pmlogger start ; ignore_errors=no ; start_time=[Sat 2024-08-17 06:24:03 EDT] ; stop_time=[Sat 2024-08-17 06:24:03 EDT] ; pid=21745 ; code=exited ; status=0 }", 
        "ExecStop": "{ path=/usr/share/pcp/lib/pmlogger ; argv[]=/usr/share/pcp/lib/pmlogger stop ; ignore_errors=no ; start_time=[Sat 2024-08-17 06:24:03 EDT] ; stop_time=[Sat 2024-08-17 06:24:03 EDT] ; pid=21685 ; code=exited ; status=0 }", 
        "FailureAction": "none", 
        "FileDescriptorStoreMax": "0", 
        "FragmentPath": "/usr/lib/systemd/system/pmlogger.service", 
        "GuessMainPID": "yes", 
        "IOScheduling": "0", 
        "Id": "pmlogger.service", 
        "IgnoreOnIsolate": "no", 
        "IgnoreOnSnapshot": "no", 
        "IgnoreSIGPIPE": "yes", 
        "InactiveEnterTimestamp": "Sat 2024-08-17 06:24:03 EDT", 
        "InactiveEnterTimestampMonotonic": "344708555", 
        "InactiveExitTimestamp": "Sat 2024-08-17 06:24:03 EDT", 
        "InactiveExitTimestampMonotonic": "344709790", 
        "JobTimeoutAction": "none", 
        "JobTimeoutUSec": "0", 
        "KillMode": "control-group", 
        "KillSignal": "15", 
        "LimitAS": "18446744073709551615", 
        "LimitCORE": "18446744073709551615", 
        "LimitCPU": "18446744073709551615", 
        "LimitDATA": "18446744073709551615", 
        "LimitFSIZE": "18446744073709551615", 
        "LimitLOCKS": "18446744073709551615", 
        "LimitMEMLOCK": "65536", 
        "LimitMSGQUEUE": "819200", 
        "LimitNICE": "0", 
        "LimitNOFILE": "4096", 
        "LimitNPROC": "14311", 
        "LimitRSS": "18446744073709551615", 
        "LimitRTPRIO": "0", 
        "LimitRTTIME": "18446744073709551615", 
        "LimitSIGPENDING": "14311", 
        "LimitSTACK": "18446744073709551615", 
        "LoadState": "loaded", 
        "MainPID": "27402", 
        "MemoryAccounting": "no", 
        "MemoryCurrent": "18446744073709551615", 
        "MemoryLimit": "18446744073709551615", 
        "MountFlags": "0", 
        "Names": "pmlogger.service", 
        "NeedDaemonReload": "no", 
        "Nice": "0", 
        "NoNewPrivileges": "no", 
        "NonBlocking": "no", 
        "NotifyAccess": "none", 
        "OOMScoreAdjust": "0", 
        "OnFailureJobMode": "replace", 
        "PIDFile": "/run/pcp/pmlogger.pid", 
        "PermissionsStartOnly": "no", 
        "PrivateDevices": "no", 
        "PrivateNetwork": "no", 
        "PrivateTmp": "no", 
        "ProtectHome": "no", 
        "ProtectSystem": "no", 
        "RefuseManualStart": "no", 
        "RefuseManualStop": "no", 
        "RemainAfterExit": "no", 
        "Requires": "basic.target system.slice", 
        "Restart": "always", 
        "RestartUSec": "100ms", 
        "Result": "success", 
        "RootDirectoryStartOnly": "no", 
        "RuntimeDirectoryMode": "0755", 
        "SameProcessGroup": "no", 
        "SecureBits": "0", 
        "SendSIGHUP": "no", 
        "SendSIGKILL": "yes", 
        "Slice": "system.slice", 
        "StandardError": "inherit", 
        "StandardInput": "null", 
        "StandardOutput": "journal", 
        "StartLimitAction": "none", 
        "StartLimitBurst": "5", 
        "StartLimitInterval": "10000000", 
        "StartupBlockIOWeight": "18446744073709551615", 
        "StartupCPUShares": "18446744073709551615", 
        "StatusErrno": "0", 
        "StopWhenUnneeded": "no", 
        "SubState": "running", 
        "SyslogLevelPrefix": "yes", 
        "SyslogPriority": "30", 
        "SystemCallErrorNumber": "0", 
        "TTYReset": "no", 
        "TTYVHangup": "no", 
        "TTYVTDisallocate": "no", 
        "TasksAccounting": "no", 
        "TasksCurrent": "18446744073709551615", 
        "TasksMax": "18446744073709551615", 
        "TimeoutStartUSec": "1min 30s", 
        "TimeoutStopUSec": "1min 30s", 
        "TimerSlackNSec": "50000", 
        "Transient": "no", 
        "Type": "forking", 
        "UMask": "0022", 
        "UnitFilePreset": "disabled", 
        "UnitFileState": "enabled", 
        "WantedBy": "multi-user.target", 
        "WatchdogTimestampMonotonic": "0", 
        "WatchdogUSec": "0"
    }
}
META: ran handlers

TASK [Get final state of services] *********************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/restore_services_state.yml:3
Saturday 17 August 2024  06:24:17 -0400 (0:00:06.168)       0:01:22.169 ******* 
ok: [managed_node1] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "NetworkManager.service": {
                "name": "NetworkManager.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "arp-ethers.service": {
                "name": "arp-ethers.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "auditd.service": {
                "name": "auditd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "autovt@.service": {
                "name": "autovt@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "enabled"
            }, 
            "blk-availability.service": {
                "name": "blk-availability.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "brandbot.service": {
                "name": "brandbot.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "chrony-dnssrv@.service": {
                "name": "chrony-dnssrv@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "chrony-wait.service": {
                "name": "chrony-wait.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "chronyd.service": {
                "name": "chronyd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "cloud-config.service": {
                "name": "cloud-config.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "cloud-final.service": {
                "name": "cloud-final.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "cloud-init-local.service": {
                "name": "cloud-init-local.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "cloud-init.service": {
                "name": "cloud-init.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "console-getty.service": {
                "name": "console-getty.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "console-shell.service": {
                "name": "console-shell.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "container-getty@.service": {
                "name": "container-getty@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "cpupower.service": {
                "name": "cpupower.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "crond.service": {
                "name": "crond.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.import1.service": {
                "name": "dbus-org.freedesktop.import1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service", 
                "source": "systemd", 
                "state": "active", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.machine1.service": {
                "name": "dbus-org.freedesktop.machine1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "dbus.service": {
                "name": "dbus.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "debug-shell.service": {
                "name": "debug-shell.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-mount.service": {
                "name": "dracut-mount.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "ebtables.service": {
                "name": "ebtables.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "emergency.service": {
                "name": "emergency.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "firewalld.service": {
                "name": "firewalld.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "fstrim.service": {
                "name": "fstrim.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "getty@.service": {
                "name": "getty@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "enabled"
            }, 
            "getty@tty1.service": {
                "name": "getty@tty1.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "unknown"
            }, 
            "gssproxy.service": {
                "name": "gssproxy.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "disabled"
            }, 
            "halt-local.service": {
                "name": "halt-local.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "iprdump.service": {
                "name": "iprdump.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "iprinit.service": {
                "name": "iprinit.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "iprupdate.service": {
                "name": "iprupdate.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "irqbalance.service": {
                "name": "irqbalance.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "kdump.service": {
                "name": "kdump.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "messagebus.service": {
                "name": "messagebus.service", 
                "source": "systemd", 
                "state": "active", 
                "status": "static"
            }, 
            "microcode.service": {
                "name": "microcode.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "netconsole": {
                "name": "netconsole", 
                "source": "sysv", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "network": {
                "name": "network", 
                "source": "sysv", 
                "state": "running", 
                "status": "enabled"
            }, 
            "network.service": {
                "name": "network.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "unknown"
            }, 
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "nfs-config.service": {
                "name": "nfs-config.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs-idmap.service": {
                "name": "nfs-idmap.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs-lock.service": {
                "name": "nfs-lock.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "nfs-mountd.service": {
                "name": "nfs-mountd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs-rquotad.service": {
                "name": "nfs-rquotad.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "nfs-secure.service": {
                "name": "nfs-secure.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "nfs-server.service": {
                "name": "nfs-server.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "nfs-utils.service": {
                "name": "nfs-utils.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "nfs.service": {
                "name": "nfs.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "nfslock.service": {
                "name": "nfslock.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "plymouth-halt.service": {
                "name": "plymouth-halt.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-kexec.service": {
                "name": "plymouth-kexec.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-poweroff.service": {
                "name": "plymouth-poweroff.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-quit.service": {
                "name": "plymouth-quit.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-read-write.service": {
                "name": "plymouth-read-write.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-reboot.service": {
                "name": "plymouth-reboot.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "plymouth-start.service": {
                "name": "plymouth-start.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "plymouth-switch-root.service": {
                "name": "plymouth-switch-root.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "pmcd.service": {
                "name": "pmcd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "pmie.service": {
                "name": "pmie.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "pmie_check.service": {
                "name": "pmie_check.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "pmie_daily.service": {
                "name": "pmie_daily.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "pmlogger.service": {
                "name": "pmlogger.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "pmlogger_check.service": {
                "name": "pmlogger_check.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "disabled"
            }, 
            "pmlogger_daily-poll.service": {
                "name": "pmlogger_daily-poll.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "pmlogger_daily.service": {
                "name": "pmlogger_daily.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "disabled"
            }, 
            "pmlogger_daily_report-poll.service": {
                "name": "pmlogger_daily_report-poll.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "pmlogger_daily_report.service": {
                "name": "pmlogger_daily_report.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "pmproxy.service": {
                "name": "pmproxy.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "disabled"
            }, 
            "polkit.service": {
                "name": "polkit.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "postfix.service": {
                "name": "postfix.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "quotaon.service": {
                "name": "quotaon.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "rc-local.service": {
                "name": "rc-local.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rdisc.service": {
                "name": "rdisc.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "rescue.service": {
                "name": "rescue.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "restraintd.service": {
                "name": "restraintd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "rhel-autorelabel-mark.service": {
                "name": "rhel-autorelabel-mark.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-autorelabel.service": {
                "name": "rhel-autorelabel.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-configure.service": {
                "name": "rhel-configure.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-dmesg.service": {
                "name": "rhel-dmesg.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-domainname.service": {
                "name": "rhel-domainname.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-import-state.service": {
                "name": "rhel-import-state.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-loadmodules.service": {
                "name": "rhel-loadmodules.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rhel-readonly.service": {
                "name": "rhel-readonly.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "rngd.service": {
                "name": "rngd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "rpc-gssd.service": {
                "name": "rpc-gssd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rpc-rquotad.service": {
                "name": "rpc-rquotad.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rpc-statd.service": {
                "name": "rpc-statd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "rpcbind.service": {
                "name": "rpcbind.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "rpcgssd.service": {
                "name": "rpcgssd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "rpcidmapd.service": {
                "name": "rpcidmapd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "rsyncd.service": {
                "name": "rsyncd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "rsyncd@.service": {
                "name": "rsyncd@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "rsyslog.service": {
                "name": "rsyslog.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "selinux-policy-migrate-local-changes@.service": {
                "name": "selinux-policy-migrate-local-changes@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "selinux-policy-migrate-local-changes@targeted.service": {
                "name": "selinux-policy-migrate-local-changes@targeted.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "unknown"
            }, 
            "serial-getty@.service": {
                "name": "serial-getty@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "disabled"
            }, 
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "unknown"
            }, 
            "sshd-keygen.service": {
                "name": "sshd-keygen.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "sshd.service": {
                "name": "sshd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "sshd@.service": {
                "name": "sshd@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-ask-password-plymouth.service": {
                "name": "systemd-ask-password-plymouth.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-bootchart.service": {
                "name": "systemd-bootchart.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }, 
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-halt.service": {
                "name": "systemd-halt.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-hibernate-resume@.service": {
                "name": "systemd-hibernate-resume@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-importd.service": {
                "name": "systemd-importd.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-initctl.service": {
                "name": "systemd-initctl.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-journald.service": {
                "name": "systemd-journald.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "systemd-kexec.service": {
                "name": "systemd-kexec.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-localed.service": {
                "name": "systemd-localed.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-logind.service": {
                "name": "systemd-logind.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-machined.service": {
                "name": "systemd-machined.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-nspawn@.service": {
                "name": "systemd-nspawn@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "disabled"
            }, 
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-quotacheck.service": {
                "name": "systemd-quotacheck.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-readahead-collect.service": {
                "name": "systemd-readahead-collect.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "systemd-readahead-done.service": {
                "name": "systemd-readahead-done.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "indirect"
            }, 
            "systemd-readahead-drop.service": {
                "name": "systemd-readahead-drop.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "enabled"
            }, 
            "systemd-readahead-replay.service": {
                "name": "systemd-readahead-replay.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "enabled"
            }, 
            "systemd-reboot.service": {
                "name": "systemd-reboot.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-rfkill@.service": {
                "name": "systemd-rfkill@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "systemd-shutdownd.service": {
                "name": "systemd-shutdownd.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-suspend.service": {
                "name": "systemd-suspend.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-timedated.service": {
                "name": "systemd-timedated.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "static"
            }, 
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-udevd.service": {
                "name": "systemd-udevd.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "static"
            }, 
            "systemd-update-done.service": {
                "name": "systemd-update-done.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service", 
                "source": "systemd", 
                "state": "stopped", 
                "status": "static"
            }, 
            "teamd@.service": {
                "name": "teamd@.service", 
                "source": "systemd", 
                "state": "unknown", 
                "status": "static"
            }, 
            "tuned.service": {
                "name": "tuned.service", 
                "source": "systemd", 
                "state": "running", 
                "status": "enabled"
            }, 
            "wpa_supplicant.service": {
                "name": "wpa_supplicant.service", 
                "source": "systemd", 
                "state": "inactive", 
                "status": "disabled"
            }
        }
    }, 
    "changed": false
}
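
The service inventory above has the shape produced by Ansible's service_facts module: ansible_facts.services keyed by unit name, with name, source, state, and status per entry. A minimal sketch of how such a snapshot can be captured and registered follows; the task and variable names are illustrative assumptions, not necessarily what the test files use.

    # Sketch only: snapshot every known service unit into a registered variable.
    # "final_state" is an assumed register name, used here for illustration.
    - name: Get final state of services
      service_facts:
      register: final_state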

TASK [Restore state of services] ***********************************************
task path: /tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/restore_services_state.yml:9
Saturday 17 August 2024  06:24:18 -0400 (0:00:01.069)       0:01:23.238 ******* 
skipping: [managed_node1] => (item=pmcd)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "pmcd", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=pmlogger)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "pmlogger", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=pmie)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "pmie", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=pmproxy)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "pmproxy", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=redis)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "redis", 
    "skip_reason": "Conditional result was False"
}
skipping: [managed_node1] => (item=grafana-server)  => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "item": "grafana-server", 
    "skip_reason": "Conditional result was False"
}
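
Every item above was skipped, meaning the restore conditional evaluated to false for pmcd, pmlogger, pmie, pmproxy, redis, and grafana-server, so no service needed its state reverted. A plausible sketch of such a restore task, assuming initial_state and final_state were registered from service_facts snapshots (variable names and the exact condition are assumptions, not the test's verbatim source):

    # Sketch only: revert each listed service to its initially observed state,
    # but only if both snapshots know the unit and its state actually changed.
    - name: Restore state of services
      service:
        name: "{{ item }}"
        state: "{{ 'started' if initial_state.ansible_facts.services[item ~ '.service'].state == 'running' else 'stopped' }}"
      loop:
        - pmcd
        - pmlogger
        - pmie
        - pmproxy
        - redis
        - grafana-server
      when:
        - (item ~ '.service') in initial_state.ansible_facts.services
        - (item ~ '.service') in final_state.ansible_facts.services
        - initial_state.ansible_facts.services[item ~ '.service'].state != final_state.ansible_facts.services[item ~ '.service'].state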
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
managed_node1              : ok=54   changed=23   unreachable=0    failed=0    skipped=36   rescued=0    ignored=0   

Saturday 17 August 2024  06:24:18 -0400 (0:00:00.088)       0:01:23.327 ******* 
=============================================================================== 
fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Install needed Elasticsearch metrics packages -- 26.43s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:41 
fedora.linux_system_roles.private_metrics_subrole_pcp : Install Performance Co-Pilot packages -- 12.24s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:27 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric logging is restarted and enabled on boot --- 6.20s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmlogger.yml:60 
fedora.linux_system_roles.private_metrics_subrole_pcp : Restart pmlogger --- 6.17s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/handlers/main.yml:19 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra performance rules are installed for targeted hosts --- 3.79s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:24 
fedora.linux_system_roles.private_metrics_subrole_pcp : Install authentication packages --- 3.33s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/main.yml:33 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra rules symlinks have been created for targeted hosts --- 1.90s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:86 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector is restarted and enabled on boot --- 1.64s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:117 
Gathering Facts --------------------------------------------------------- 1.47s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/tests_verify_from_elasticsearch.yml:9 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra performance rule group link directories exist --- 1.47s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:14 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure extra performance rule group directories exist --- 1.45s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:4 
Get initial state of services ------------------------------------------- 1.36s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/get_services_state.yml:3 
Get final state of services --------------------------------------------- 1.07s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/tests/metrics/restore_services_state.yml:3 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric inference is restarted and enabled on boot --- 0.91s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmie.yml:127 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector system accounts are configured --- 0.88s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:60 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure any implicit metric labels are configured --- 0.73s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:46 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector is configured --- 0.72s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:53 
fedora.linux_system_roles.private_metrics_subrole_elasticsearch : Ensure PCP Elasticsearch agent is configured --- 0.71s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_elasticsearch/tasks/main.yml:55 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure any explicit metric labels are configured --- 0.67s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:39 
fedora.linux_system_roles.private_metrics_subrole_pcp : Ensure performance metric collector authentication is configured --- 0.64s
/tmp/collections-NLJ/ansible_collections/fedora/linux_system_roles/roles/private_metrics_subrole_pcp/tasks/pmcd.yml:86