I'm running ISA 2006. We currently use the Proventia web filter, but I'm not at all pleased with it, and they're phasing out support for ISA anyway. My plan is to use ISA itself, along with a few other tools, to handle the blocking, with URLBlacklist.com as part of the solution.

As a test, I've downloaded one of their site lists. I already have my rule and URL set in place; I just need to load one of their lists into my URL sets. We wrote a script to do this: I export the URL set from ISA into an HTML document, then run the script, which takes the URLBlacklist file and dumps its entries into that document. The problem is the adult-site lists, where the URLBlacklist files run 5 MB to 15 MB. On files that size the merge takes overnight, sometimes longer, and the whole time the server sits at 100% CPU trying to move the information. That isn't a workable solution.

Has anyone come up with a solution to a problem like this? Thank you in advance for any ideas.
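In case it helps anyone diagnosing the same slowdown: run times like this usually come from the script re-saving or re-parsing the whole export document once per blacklist entry (which is roughly O(n²) work), not from the 5-15 MB file size itself. Below is a minimal Python sketch of the batch approach: build all new entries in memory and splice them into the export with a single write. The `<UrlSet>`/`<Url>` tag names here are hypothetical placeholders, not the real ISA export schema, so adjust them to match whatever your exported document actually contains:

```python
# Merge a URLBlacklist "domains" file into an exported URL-set document in
# one pass: read everything, build the new entries in memory, write once.
# NOTE: the <UrlSet>/<Url> element names are placeholders -- substitute the
# tags your actual ISA export uses.

def merge_blacklist(export_text, blacklist_lines, marker="</UrlSet>"):
    """Insert one <Url> entry per blacklist domain just before `marker`.

    Joining all entries into one string and splicing it in with a single
    replace is linear in the list size; re-saving the document after every
    entry is quadratic, which is what turns a 5-15 MB list into an
    overnight job.
    """
    entries = []
    for line in blacklist_lines:
        domain = line.strip()
        if not domain or domain.startswith("#"):  # skip blanks and comments
            continue
        entries.append("  <Url>*.%s</Url>" % domain)
    block = "\n".join(entries) + "\n"
    # One splice instead of one document rewrite per entry.
    return export_text.replace(marker, block + marker, 1)

# Toy example with a stand-in export document:
template = "<UrlSet>\n</UrlSet>"
domains = ["example.com", "", "# a comment line", "badsite.net"]
print(merge_blacklist(template, domains))
```

With the real files you would read the export and the blacklist with `open(...).read()` / `readlines()`, call the function once, and write the result back out; the whole run should then be I/O-bound rather than CPU-bound.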