Long term data analysis options

I do a weekly scan through the logs using 'long term data / query log', looking for suspicious domains that slipped through. I'll generally try to filter on terms like 'geo', 'location', 'telemetry', 'ads', etc., but that can generate a very long list with many repeated domains, e.g. thousands of hits to watson.telemetry.microsoft.com. What would make this much easier is a way to show only unique domains from the filter, perhaps with a counter showing the actual hit count.
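For illustration, the deduplication being asked for is essentially a `GROUP BY` over the long-term database. The sketch below is a minimal stand-in, not Pi-hole's actual code: it builds an in-memory SQLite table loosely modelled on the `queries` table in `pihole-FTL.db` (only the `timestamp` and `domain` columns; the filter term and sample rows are made up for the example).

```python
import sqlite3

# In-memory stand-in for Pi-hole's long-term database (pihole-FTL.db).
# The real `queries` table has more columns (id, type, status, client, ...).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE queries (timestamp INTEGER, domain TEXT)")
con.executemany(
    "INSERT INTO queries VALUES (?, ?)",
    [
        (1700000000, "watson.telemetry.microsoft.com"),
        (1700000005, "watson.telemetry.microsoft.com"),
        (1700000010, "example.com"),
        (1700000020, "geo.example.net"),
    ],
)

# Collapse repeated domains matching a filter term into one row each,
# with a hit counter -- the behaviour the first request asks for.
rows = con.execute(
    """
    SELECT domain, COUNT(*) AS hits
    FROM queries
    WHERE domain LIKE '%' || ? || '%'
    GROUP BY domain
    ORDER BY hits DESC
    """,
    ("telemetry",),
).fetchall()
print(rows)  # [('watson.telemetry.microsoft.com', 2)]
```

So thousands of identical `watson.telemetry.microsoft.com` entries would become a single line with its count, which is all the feature would need to surface in the UI.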

Another useful feature would be a way to jump to a specific time window, e.g. +/- 10 seconds, from a given entry in the filtered list. That could help isolate the surrounding context of the query and give some insight into which site or application might have instigated it.
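The time-window jump can be sketched the same way: given the timestamp of a filtered entry, pull every query within +/- 10 seconds of it. Again this is a hypothetical sketch against an in-memory table mimicking the `queries` table, with invented sample data; it is not how the web interface is implemented.

```python
import sqlite3

# In-memory stand-in for the `queries` table in pihole-FTL.db.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE queries (timestamp INTEGER, domain TEXT)")
con.executemany(
    "INSERT INTO queries VALUES (?, ?)",
    [
        (1700000085, "a.example"),            # outside the window
        (1700000095, "cdn.site.example"),     # 5 s before the anchor
        (1700000100, "geo.tracker.example"),  # the suspicious entry itself
        (1700000108, "api.site.example"),     # 8 s after the anchor
        (1700000120, "b.example"),            # outside the window
    ],
)

anchor_ts = 1700000100  # timestamp of the entry selected in the filtered list
window = 10             # seconds of context either side

# Everything queried within +/- 10 s of the selected entry, in order --
# the surrounding context that hints at which app instigated the lookup.
context = con.execute(
    "SELECT timestamp, domain FROM queries"
    " WHERE timestamp BETWEEN ? AND ? ORDER BY timestamp",
    (anchor_ts - window, anchor_ts + window),
).fetchall()
print(context)
```

Seeing `cdn.site.example` and `api.site.example` bracketing the suspicious lookup is exactly the kind of clue about the originating site or application the request describes.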

Hopefully it's ok to put these in the same suggestion. Thank you for all your hard work on this amazing project.

The first request is covered by

and

I would like to ask you to remove the first part of your FR and keep just the wish for a way to jump to a specific time window. Please rename the title accordingly.

Ok, can do. Sorry for the slow reply; I seldom log in and post even less frequently! :slight_smile: I couldn't see an option to edit my post title, so I created a new topic and I'll just delete this one. I think it's pending approval, though.

Thanks.