I upgraded my ISA 2000 server from Win2000 to Win2003 per KB331062 (both the section on upgrades and the section on known issues). Since the upgrade, an application that previously used a firewall session to complete a transaction successfully no longer works. The application, running on a Win2003 Terminal Server, was not changed, nor was the terminal server. The first FW log snippet is from before the upgrade (successful):
According to the log excerpts, the only difference is that in the unsuccessful log no bytes seem to have been transferred. At the connection level, however, everything appears to work (connect 0 and connect 20000).
For more info about the log entries, see the ISA help file, section "Firewall and Web Proxy log fields".
Update: I broke down and forked over $245 to Microsoft to spend 8 hours on the phone, just to have them tell me that upgrading a Win2000 server running ISA to Win2003 is not recommended and that I will need to uninstall/reinstall ISA to correct it. Has anyone come across MS documentation that asserts the "upgrade not recommended" claim I'm being fed?
The MS tech explained the problem this way: I filtered the NETMON trace to the primary address on the external NIC. When the request went out, an incorrect MTU size created a "Black Hole Router" condition, which caused the request to iterate through all of the IP addresses bound to my external NIC, trying to find a way out. The NETMON trace captured the failed request on the last IP address bound to the external NIC. Oddly enough, changing the MTU size in the registry and rebooting had no effect on the problem or its symptoms, so the tech had me undo all of the changes, and that's when the "reinstall" issue arose.

I prefer to do clean installs, but sometimes you find your hands tied because someone higher on the food chain reads a glowing success story in an I.T. trade mag about the pros of upgrading, without adequate coverage of the potential problems. It's a case of "So it is written, so it shall be". All of us have been there. It's just my turn today.
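For anyone who wants to verify a "Black Hole Router" diagnosis like the one above without a support call: the standard technique is to probe the path with ping's don't-fragment flag (on Windows, ping -f -l <size> <remote-host>) and walk the payload size down until replies come back. That technique is generic and not from this thread; the sketch below only does the size arithmetic, where 28 bytes is the standard IPv4 (20) plus ICMP (8) header overhead:

```python
# Minimal sketch for sizing "ping -f -l <size>" probes when hunting a
# black-hole router. An ICMP echo carries 28 bytes of overhead (20-byte
# IPv4 header + 8-byte ICMP header), so a link MTU of 1500 allows a
# 1472-byte payload to pass unfragmented.

IP_AND_ICMP_OVERHEAD = 28  # bytes of IPv4 + ICMP headers per probe


def max_ping_payload(mtu: int) -> int:
    """Largest -l payload that fits in one unfragmented packet at this MTU."""
    return mtu - IP_AND_ICMP_OVERHEAD


def probe_sizes(start_mtu: int = 1500, floor: int = 576, step: int = 8):
    """Candidate payload sizes to try, walking down from the nominal MTU."""
    mtu = start_mtu
    while mtu >= floor:
        yield max_ping_payload(mtu)
        mtu -= step


if __name__ == "__main__":
    # The largest payload that gets a reply (rather than "Packet needs to be
    # fragmented but DF set") approximates the path MTU: payload + 28.
    print(max_ping_payload(1500))  # 1472
```

The resulting value (largest working payload plus 28) is what would go into the per-interface MTU registry setting the tech had me try.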
quote:I prefer to do clean installs, but sometimes you find your hands tied because someone higher on the food chain reads a glowing success story in an I.T. trade mag about the pros of upgrading, without adequate coverage of the potential problems. It's a case of "So it is written, so it shall be".
In that case, he can't complain that it isn't working, since you followed his "smart" advice!
WOW! Where do I begin? After disappearing into Microsoft's hold queue this weekend for over 4 hours and getting misrouted, I finally found someone who knew what they were doing. If you ever get BillB in the ISA queue, consider yourself lucky. As it turns out, we were able to trace 2 of the 3 main problems down, and they are good ones. It took the better part of 12 hours on the phone, along with the support rep gaining remote control of our ISA server yesterday, to get back online. Among the more interesting items:

We removed all IPs bound to our onboard NIC (external), disabled the device in the Win2k3 OS, replaced it with an Intel NIC, bound the previously removed IPs to the new NIC, and rebooted. We were initially unable to disable the onboard adapter in the BIOS. These 2 events led to a stack problem: packet filtering was enabled on the ISA server, and those filters were bound to the old NIC, which was disabled in the OS but still enabled on the mainboard. The packet filters were bound to an adapter that was disabled but still enumerated in the registry with an address of 0.0.0.0, which caused ALL of our outbound DNS traffic to become broadcast traffic.

The solution appeared to be this: export the ISA config, uninstall and reinstall ISA, and finally reimport the settings. This failed because the ISA uninstallation routine left the packet filters (and apparently only the packet filters) in the registry, so reinstalling ISA failed to correct the problem. I belatedly discovered the "rmisa.exe" tool in the \i386 folder on the ISA CD, used it to uninstall ISA, and reinstalled. It only partially worked. The rules remained, but the bindings to the disabled NIC were broken. Unfortunately, so was one packet filter rule: the export/import process whacked our "block" rule for NetBIOS over the external interface. Our DNS rule was intact, but our NetBIOS rule now blocked DNS. How could this possibly happen, you ask? Well, so did I. Bill couldn't answer it yesterday.
After 12 hours, I was happy to have our server back up and working, and very happy to have had Bill tweak our DNS config in the process. This morning, after running all night successfully, I decided to disable our anonymous outbound-access Site and Content rule and replace it with something more restrictive. After restarting the services, we lost all external name resolution. I restored the anonymous rule and restarted the services. Still no DNS. Only rebooting the ISA server restored DNS resolution. We looked at logs. We looked at MPS reports. We looked at event logs. Nothing. Finally, Bill noticed that we were using quite a bit of our pagefile. The supposition now is that all of the flaky events were caused by insufficient memory (we have 384M). We have made plans to offload our IIS server and our AV server, and to increase the RAM. Tonight, I will be testing the ISA server to recreate the problem, to see if we can narrow it down definitively to an insufficient-memory situation. Hopefully, the next post will be far less entertaining than this one.