Long Term Query Log -> Excel Pivot Chart

Hi,

I know this topic has already been discussed once, but from my perspective it was never really answered.

My question is: is there a way to extract a large amount of data from the long-term query log into CSV (or any other format) so that I can import it into Excel and build a pivot table from it?
I want to do further analysis of the data, but the web frontend keeps crashing whenever I try to view more than a few days.

This is also a feature request from my side :slight_smile:
(Add something like a "Download" button to the "Long-term data" menu, so you can download the current view as a CSV or some other kind of log file.)

But anyway, is there a way to get such a log file (for the last 30 days, for example)?

Regards,

V.

Hmmm, okay, thanks for this for now.

But I think my basic knowledge of Linux and SQL is not enough to understand any of the steps in that tutorial :confused:
Is there maybe something I can do via SSH to create some kind of log file?
I mean, the raw data is there, I can see it in the HTML view, but it's too large to copy/paste.
There must be an easier way (I hope).

V.

You could copy the DB to a non-root user's /home, then SSH in and copy it to another machine to work on it with SQLiteStudio. The HTML files are very large, but the CSV export takes up much less space.
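
Something along these lines should work (a rough sketch; the username "pi" and the host name are just placeholders for your setup):

# on the Pi-hole: copy the FTL database to a location a normal user may read
sudo cp /etc/pihole/pihole-FTL.db /home/pi/pihole-FTL-copy.db
sudo chown pi:pi /home/pi/pihole-FTL-copy.db

# on your workstation: pull the copy over SSH and open it in SQLiteStudio
scp pi@pi.hole:/home/pi/pihole-FTL-copy.db .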

You could try to increase PHP's memory limit.
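
Roughly like this (a sketch; the exact php.ini path depends on your PHP version and how the web server runs PHP):

# in the php.ini used by the web interface (on Debian-based systems often /etc/php/<version>/cgi/php.ini):
memory_limit = 256M
# then restart the web server, e.g.:
sudo systemctl restart lighttpd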

OK, that sounds like a possible solution.
Do you maybe know where I can find this file (Ubuntu Core OS, if that helps)?

@yubiuser: that's not the problem. The HTML view itself works fine (since I increased the PHP memory a long time ago). But I cannot get the data from the web page into Excel. Of course copy/paste works, but only as long as you don't have hundreds of pages. So I thought there might be some kind of export function.

sqlite3 /etc/pihole/pihole-FTL.db -header -csv "Select * from queries where timestamp > strftime('%s','now','-30 days');" > ./export.csv
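
If you want the timestamps to be human-readable in Excel right away, a variation like this should also work (a sketch that assumes the usual columns of the queries table and converts the Unix timestamp with SQLite's datetime()):

sqlite3 /etc/pihole/pihole-FTL.db -header -csv "SELECT datetime(timestamp,'unixepoch','localtime') AS time, type, status, domain, client FROM queries WHERE timestamp > strftime('%s','now','-30 days');" > ./export.csv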

OK, this works fine.
Thanks a lot for the help!

Now I need to deep-dive into the raw data :nerd_face:
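
(For quick checks, simple aggregations can also be done directly in sqlite3 instead of Excel; a rough sketch, again assuming the usual columns of the queries table, e.g. the top 25 domains of the last 30 days:)

sqlite3 /etc/pihole/pihole-FTL.db -header -csv "SELECT domain, count(*) AS hits FROM queries WHERE timestamp > strftime('%s','now','-30 days') GROUP BY domain ORDER BY hits DESC LIMIT 25;" > ./top_domains.csv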

But still: wouldn't this be a nice feature for a future release? I mean, the function that creates the HTML content is obviously already there, so from my perspective it should be easy to add some kind of export function. Just like the Teleporter function: download a tar.gz or something.
