Posts: 7
Joined: 30.Dec.2004
From: Toronto
Status: offline
Good Evening everyone,
I'm just looking for some advice on a good URL block list. There are a number of places on the internet that host such files, and I'd like to hear from people's experience which lists are good to use.
Right now I've just been adding sites as I see users accessing them. It would be nice to import a big list of URLs.
I got my list from squidguard. If I remember correctly it is a Unix-based list and needs some reformatting.
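The reformatting step can be sketched like this, assuming the squidguard blacklist is a plain text file with one domain per line (Unix line endings, possibly with `#` comments) — `clean_blocklist` is a hypothetical helper, not part of any squidguard or ISA tool:

```python
def clean_blocklist(text):
    """Normalize a squidguard-style domain list: drop comments and
    blank lines, trim whitespace, lowercase, and remove duplicates
    while preserving first-seen order."""
    seen = set()
    domains = []
    for line in text.splitlines():  # handles both \n and \r\n endings
        line = line.strip().lower()
        if not line or line.startswith("#"):
            continue
        if line not in seen:
            seen.add(line)
            domains.append(line)
    return domains
```

Feed it the raw file contents and you get a clean Python list ready to be grouped or pasted into whatever format you need.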
When creating my block list I created a separate Domain Name Set for each letter of the alphabet, then created a different rule for each Domain Name Set. This allows for better management of sites: if you find a site that gets blocked accidentally, you can look in the log file and see which rule blocked it.
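The per-letter grouping above can be sketched as follows (a hypothetical helper for preparing the input, not something ISA provides):

```python
from collections import defaultdict

def group_by_letter(domains):
    """Bucket domains into one group per leading character,
    mirroring a 'one Domain Name Set per letter' layout."""
    groups = defaultdict(list)
    for d in domains:
        groups[d[0].lower()].append(d)
    return dict(groups)
```

Each resulting bucket maps directly to one Domain Name Set (and its matching rule), so a blocked site in the logs immediately tells you which set to search.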
The other thing I did that made things a lot easier: I created my Domain Name Sets without adding anything to them, then exported those settings, modified the XML file, and re-imported it. A simple copy and paste instead of having to type each site in one by one.
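Generating the lines to paste into the exported XML can be sketched like this. Note the element template here is a placeholder assumption — the actual element names and attributes depend on your ISA export, so copy a real entry from your own exported file and use that as the template:

```python
def make_entries(domains, template='<Entry Name="{d}"/>'):
    """Produce one XML line per domain to paste into an exported
    Domain Name Set. The default template is a PLACEHOLDER; replace
    it with the real element copied from your own ISA export."""
    return "\n".join(template.format(d=d) for d in domains)
```

Run it over your cleaned domain list, paste the output into the empty set in the exported XML, and re-import.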
Posts: 271
Joined: 5.May2001
From: Redmond, WA
Status: offline
The fact is, the fewer rules you have, the more efficient the ISA policy search engine is.
You should create a single "noway" rule that blocks these lists in one shot. Of course, if you have 50,000 entries in a single URL Set, then you might need to review them for duplicates.
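Reviewing a large set for duplicates is easy to automate. A minimal sketch, assuming you have your lists as plain sequences of strings (`merge_lists` is a hypothetical helper, not an ISA feature):

```python
def merge_lists(*lists):
    """Merge several URL/domain lists into one, dropping duplicates
    case-insensitively while keeping first-seen order and casing."""
    seen = set()
    merged = []
    for lst in lists:
        for item in lst:
            key = item.lower()
            if key not in seen:
                seen.add(key)
                merged.append(item)
    return merged
```

Merging everything through a function like this before building the single URL Set keeps the one "noway" rule as small as it can be.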