Long-term data analysis options

I do a weekly scan through the logs using ‘Long-term data / Query Log’ and look for suspicious domains that slipped through. I generally try to filter on terms like ‘geo’, ‘location’, ‘telemetry’, ‘ads’, etc., but that can produce a very long list with a lot of repeated domains, e.g. thousands of hits to watson.telemetry.microsoft.com. What would make this much easier is a way to show only the unique domains matching the filter, perhaps with a counter showing the actual hit count for each.
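In the meantime, a deduplicated view like this can be approximated with a direct SQLite query against the long-term database (`/etc/pihole/pihole-FTL.db`). A minimal sketch, assuming the documented `queries` table with `timestamp`, `domain`, and `status` columns; the in-memory table and sample rows here are invented stand-ins so the snippet runs on its own:

```python
import sqlite3

# Stand-in for /etc/pihole/pihole-FTL.db: a tiny in-memory table with the
# same columns the real query would touch. Sample data is invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE queries (timestamp INTEGER, domain TEXT, status INTEGER)")
con.executemany(
    "INSERT INTO queries VALUES (?, ?, ?)",
    [
        (1700000000, "watson.telemetry.microsoft.com", 2),
        (1700000005, "watson.telemetry.microsoft.com", 2),
        (1700000010, "geo.example.net", 2),
    ],
)

# Collapse repeated domains matching a filter term into one row each,
# with a hit counter, most-queried first.
rows = con.execute(
    """
    SELECT domain, COUNT(*) AS hits
    FROM queries
    WHERE domain LIKE '%telemetry%'
    GROUP BY domain
    ORDER BY hits DESC
    """
).fetchall()
for domain, hits in rows:
    print(f"{hits:6d}  {domain}")
```

Pointed at the real database, the same `GROUP BY domain` turns those thousands of watson.telemetry.microsoft.com lines into a single row with its count.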

Another useful feature would be a way to jump to a specific time window, e.g. +/- 10 seconds, around a given entry in the filtered list. That would help isolate the surrounding context of the query and give some insight into which site or application might have triggered it.
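This one can also be approximated manually for now, since the `timestamp` column is a Unix epoch: take the timestamp of the suspicious entry and select everything within the window around it. A sketch under the same assumptions as above (in-memory table with invented rows standing in for the real database):

```python
import sqlite3

# Stand-in for the real queries table; sample data is invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE queries (timestamp INTEGER, domain TEXT)")
con.executemany(
    "INSERT INTO queries VALUES (?, ?)",
    [
        (1700000000, "cdn.example.com"),
        (1700000004, "geo.example.net"),     # the suspicious entry
        (1700000007, "api.example.org"),
        (1700000030, "unrelated.example"),   # outside the window
    ],
)

anchor = 1700000004  # timestamp of the entry picked from the filtered list
window = 10          # +/- 10 seconds of surrounding context

context = con.execute(
    "SELECT timestamp, domain FROM queries "
    "WHERE timestamp BETWEEN ? AND ? ORDER BY timestamp",
    (anchor - window, anchor + window),
).fetchall()
for ts, domain in context:
    print(ts, domain)
```

Seeing which other lookups cluster around the anchor is often enough to guess which site or application fired the query.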

Hopefully it’s ok to put these in the same suggestion. Thank you for all your hard work on this amazing project.