Analyzing data from my Pi-hole

Hello. My name is Bret, and I am new here. Looking forward to contributing and participating.

I recently downloaded ninety-one days' worth of DNS queries from my Pi-hole, which ended up being 1.3 million records, or a 150 MB CSV file.

To better understand the data, I am experimenting with various visualizations using Python and D3, resulting in graphs such as the one below: a streamgraph showing the number of times I queried each of one hundred particular domains during that period.
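For anyone curious about the counting step behind a graph like that, here is a minimal sketch in Python. The column names (`domain`, etc.) are assumptions; a real Pi-hole export may name its fields differently, so adjust accordingly.

```python
from collections import Counter
import csv
import io

# Hypothetical sample rows standing in for the Pi-hole CSV export;
# real exports will have different columns, so adjust the field name.
sample_csv = """timestamp,domain,client
1700000000,example.com,192.168.1.10
1700000001,example.com,192.168.1.10
1700000002,pi.hole,192.168.1.11
"""

def top_domains(csv_text, n=100):
    """Count queries per domain and return the n most common."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = Counter(row["domain"] for row in reader)
    return counts.most_common(n)

print(top_domains(sample_csv, n=2))
```

From there, the per-domain counts can be bucketed by day and fed to a D3 streamgraph.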

Has anyone else done something like this? Or, are there any resources for better understanding the DNS query data I am now exploring?

Thank you for the tool. I am glad to be a member of this community.


I'm using Grafana Cloud (a Prometheus/Loki/Grafana stack in the cloud) with good results for collecting metrics/logs and visualizing them. You'd be able to enable query logging and derive this same info. Grafana Alloy has good Pi support.

The query stats here look like the Pi-hole admin page, but they're coming out of the query logs themselves. I could break this down per requested domain, per client, etc.

Grafana cloud's free service includes PLENTY of logs (50GB). Log retention is 14 days, but if you use recording rules to derive these stats automatically, metrics are retained longer. The pro service is free if you stay within the free limits (which a single pihole would totally do), and then you get a year of metrics retention.
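To illustrate the recording-rules idea, here is a rough sketch of what a rule for Loki's ruler might look like. The `job="pihole"` label and the metric name are assumptions; they depend entirely on how you ship the logs.

```yaml
groups:
  - name: pihole
    rules:
      # Turn raw query logs into a metric, so the stat outlives
      # the 14-day log retention window.
      - record: pihole:queries:count5m
        expr: sum(count_over_time({job="pihole"}[5m]))
```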


I've posted both the exporter and dashboard thus far if anyone's interested.

https://github.com/bazmonk/pihole6_exporter


@bazmonkey I am experimenting with something similar with a (local) Grafana - Prometheus - node_exporter combo running under a Docker Portainer stack to collect Pi-hole metrics. I have been successful except for the exporter part, as there appears to be no native support in Pi-hole for this. The current node_exporter, running on my two Pi-hole Pis, has no trouble collecting system metrics but cannot do the same for Pi-hole-specific ones. Have you had any better luck getting them directly from Pi-hole since you made this post?


I messed around with getting Pi-hole metrics out of an RPi 5 and into my local Grafana instance. Here is what the dashboard ultimately looks like:

I think it is a rather nice interface.

The tools I relied on to make this happen include:

  1. pihole-exporter
  2. Dashboard id #10176

Once I had pihole-exporter exposing metrics, I imported Grafana dashboard #10176, which fits the data like a glove.

I am happy to provide the specific commands used and troubleshooting steps taken along the way to displaying real-time Pi-hole data in Grafana.


I think you did a great job and this is exactly what I was looking for ... so yes, if you are willing to provide the commands you used and your troubleshooting steps taken, I would be very grateful. Thank you.

I am happy to help.

These are the general steps I took to visualize my Pi-hole data using Prometheus and Grafana, which may or may not work for you. As with everything you read on the internet, please proceed with caution and at your own risk.

The following instructions assume you already have:

  1. Docker installed on your RPi
  2. Basic knowledge of Linux and the command line

Step 1

Run the following command in a terminal attached to your Raspberry Pi machine:

docker run -d \
  --name pihole-exporter \
  -p 9617:9617 \
  -e PIHOLE_PASSWORD="YOUR_PIHOLE_PASSWORD" \
  -e PIHOLE_HOSTNAME="localhost" \
  --restart=unless-stopped \
  ekofr/pihole-exporter:latest

Please change the value for PIHOLE_PASSWORD to your own password.

Step 2

Verify the exporter is running by entering the following command in the same terminal as the previous step:

curl http://localhost:9617/metrics

Step 3

Once you have verified pihole-exporter is producing metrics, add the following to the prometheus.yml file on your Prometheus machine:

scrape_configs:
  # ... your other jobs may be here ...

  - job_name: 'pihole'
    scrape_interval: 15s
    static_configs:
      - targets: ['<YOUR_RPI5_IP_ADDRESS>:9617']

Make sure to change the <YOUR_RPI5_IP_ADDRESS> placeholder to your Raspberry Pi's IP address.

Step 4

After you update your prometheus.yml file, restart Prometheus with the following command:

systemctl restart prometheus

Open the Prometheus web UI and confirm the pihole job is listed (and up) among its targets.
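If you prefer checking from a script rather than the UI, Prometheus also reports its targets at `/api/v1/targets` on port 9090. Here is a small sketch that picks out the pihole job from that response; the payload below is a canned example in the shape Prometheus returns, standing in for a live fetch.

```python
import json

# Canned response shaped like Prometheus's /api/v1/targets output;
# in practice you would fetch this from http://<prometheus>:9090.
payload = json.loads("""
{
  "status": "success",
  "data": {
    "activeTargets": [
      {"labels": {"job": "pihole", "instance": "192.168.1.5:9617"},
       "health": "up"}
    ]
  }
}
""")

def job_health(payload, job):
    """Return an instance -> health map for all active targets of a job."""
    return {
        t["labels"]["instance"]: t["health"]
        for t in payload["data"]["activeTargets"]
        if t["labels"]["job"] == job
    }

print(job_health(payload, "pihole"))
```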

Step 5

Log in to the Grafana web UI and import a new dashboard for the Pi-hole data source using ID 10176.

As mentioned at the top of this post, these are quick summaries of the major steps I took, and it works!

If you run into any particular challenges while attempting to do the same thing, I am happy to help debug as best I can.

Good luck!


Thank you! My setup is a bit different, but your step-by-step, with a bit of modification, worked perfectly. Again, thank you very much!


That is wonderful to hear. I am glad the instructions helped in some way.