Curating blocked-lists using stats

I was thinking about adding a field to count the number of times each site was blocked, then at 3 AM local time sorting the list so the most-blocked domains move to the top. I haven't looked at the code, but I would imagine that even if the list is segmented alphabetically or some other way, getting early matches would improve response times.
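For illustration, here is a minimal sketch of the idea as proposed. All names here are hypothetical, and this is not Pi-hole/FTL code; it just shows a linearly scanned list where a per-domain hit counter feeds a periodic re-sort so frequent matches are found earlier.

```python
# Hypothetical sketch of the proposal: a linearly scanned blocklist
# that is periodically re-sorted by hit count so the most-blocked
# domains match on the earliest comparisons. Not Pi-hole/FTL code.
from collections import Counter


class CountingBlocklist:
    def __init__(self, domains):
        self.domains = list(domains)  # scanned front to back, so order matters
        self.hits = Counter()         # per-domain block counter

    def is_blocked(self, domain):
        for d in self.domains:        # linear scan
            if d == domain:
                self.hits[d] += 1     # the proposed "times blocked" field
                return True
        return False

    def resort(self):
        # Run periodically (e.g. 3 AM local): most-blocked move to the front.
        self.domains.sort(key=lambda d: self.hits[d], reverse=True)


bl = CountingBlocklist(["a.example", "ads.example", "b.example"])
for _ in range(5):
    bl.is_blocked("ads.example")
bl.resort()
# "ads.example" now sits first, so future lookups match immediately
```

This only helps if lookups really are order-sensitive scans; as discussed below, a hashed lookup would make the ordering irrelevant.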

I've moved this to a Feature Request so that the community can vote on it and we can rank it among the other requests.

Thanks!


Hi Dan,

I did take some time to study the FTL code, and I have to confess I was unable to make out whether (or where) hashing is used to speed up lookups of blocked domains. Hashing would make my suggestion less relevant (as stated, anyway). There is probably still a way to make lookups more efficient for the set of domains encountered most often, and that set may depend on location (for instance, different countries). In a public setting that could be useful for following the ebb and flow of popular activity in a region over time. So I would generalize the suggestion to something broader than re-ordering (sorting) the list, which reflected a simplistic understanding of what goes on under the hood.
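To make the point about hashing concrete, here is a small illustrative snippet (an assumption about the lookup strategy, not FTL's actual implementation): a hash set gives amortized constant-time membership tests, so a "most-blocked first" ordering buys nothing.

```python
# Why hashing would make re-ordering irrelevant: membership tests on a
# hash set are amortized O(1) regardless of insertion or sort order.
# Illustrative only; FTL's actual lookup structures may differ.
blocked = {"ads.example", "tracker.example", "a.example"}


def is_blocked(domain: str) -> bool:
    # Constant time on average; no element is "closer to the front".
    return domain in blocked
```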
[UPDATE]
Right after this edit the light bulb went on and I realized that blocked domains are not treated in a fundamentally different way from other domains: they simply don't get real IP addresses, resolving instead to all zeros or NXDOMAIN, and they go into the hopper along with everything else. That means there is already a priority cache operating, because caching is a natural function of any DNS resolver that wants to resolve quickly. I suppose that must be handled by SQLite?
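A quick sketch of that realization, with hypothetical names and a standard-library cache standing in for whatever FTL actually does: blocked domains go through the same resolve-and-cache path as everything else, so frequently seen blocked domains are already served from fast cache storage without any special ordering.

```python
# Sketch of the realization above: a plain resolver cache already keeps
# hot answers (blocked or not) close at hand. Hypothetical code; not how
# FTL's cache is implemented.
from functools import lru_cache

BLOCKLIST = {"ads.example", "tracker.example"}


@lru_cache(maxsize=4096)
def resolve(domain: str) -> str:
    # Blocked domains get the same treatment as any other lookup: they
    # just resolve to 0.0.0.0 (or NXDOMAIN) and are cached like the rest.
    if domain in BLOCKLIST:
        return "0.0.0.0"
    return "93.184.216.34"  # placeholder upstream answer


resolve("ads.example")  # first call: computed, then cached
resolve("ads.example")  # second call: served from the cache
```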

Thanks for looking at my idea. Now that I understand FTL better, it seems naive.
