I know this topic has been discussed before, but from my perspective it was never really answered.
My question is: is there a way to extract a large amount of data from the long-term query log into CSV or another format? Then I could import it into Excel and build a pivot table from it.
I want to do further analysis of the data, but the web frontend keeps crashing whenever I try to view more than a few days.
This would also be a feature request from my side:
create something like a "download" button in the "long term data" menu, so you can download the current view as a CSV or some other kind of logfile.
But anyway, is there a way to get such a logfile (for the last 30 days, for example)?
But I think my basic knowledge of Linux and SQL is not enough to understand the steps in that tutorial.
Is there maybe something I can do via SSH to create some kind of logfile?
I mean, the raw data is there; I can see it in the HTML, but it's far too much to copy and paste.
There must be an easier way (I hope).
You could copy the DB into a non-root user's /home, then SSH in and copy it over to another machine to work on it with SQLiteStudio. The HTML files are very large, but the CSV export takes up much less space.
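If you just want a CSV over SSH, the sqlite3 command-line tool can do the export directly. A minimal sketch, with assumptions: on common installs the long-term database sits at /etc/pihole/pihole-FTL.db, and the classic schema has a `queries` table with `timestamp`, `type`, `status`, `domain`, and `client` columns. Verify both with `.schema queries` on your system before trusting the output. The mock-DB step below only exists so the sketch runs anywhere; skip it on a real Pi-hole and copy the live file instead (e.g. `sudo cp /etc/pihole/pihole-FTL.db /tmp/pihole-copy.db`), so FTL keeps exclusive access to the original.

```shell
DB=/tmp/pihole-copy.db

# Mock database so the sketch is self-contained; on a real system,
# replace this with a copy of /etc/pihole/pihole-FTL.db.
sqlite3 "$DB" "CREATE TABLE IF NOT EXISTS queries
  (id INTEGER PRIMARY KEY, timestamp INTEGER, type INTEGER,
   status INTEGER, domain TEXT, client TEXT);
  INSERT INTO queries (timestamp, type, status, domain, client)
  VALUES (strftime('%s','now'), 1, 2, 'example.com', '192.168.1.10');"

# Export the last 30 days as CSV with a header row,
# converting the Unix timestamp to a readable date.
sqlite3 -header -csv "$DB" \
  "SELECT datetime(timestamp,'unixepoch') AS time,
          type, status, domain, client
   FROM queries
   WHERE timestamp > CAST(strftime('%s','now','-30 days') AS INTEGER);" \
  > /tmp/queries_30d.csv

head /tmp/queries_30d.csv
```

The resulting queries_30d.csv opens directly in Excel, so you can build the pivot table from there without touching the web frontend.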
OK, that sounds like a possible solution.
Do you happen to know where I can find this file (Ubuntu Core OS, if that helps)?
@yubiuser: that's not the problem. The HTML view itself works fine (since I increased the PHP memory limit a long time ago). But I cannot get the data from the web page into Excel. Copy and paste works, but only as long as you don't have hundreds of pages. So I thought there might be some kind of export function.
But still: wouldn't this be a nice feature for a future release? The function to generate the HTML content obviously exists, so from my perspective it should be easy to add some kind of export function on top of it. Just like the Teleporter function: download a tar.gz or something similar.