I want a program that calls a specific static URL, extracts data from that page to determine which qualifying pages to access, constructs URLs based on a known pattern, then calls those URLs and captures a set of XML files for each qualifying page. The program must be launched from my Windows XP machine at a specified time daily using native Windows tools (e.g., Scheduled Tasks) or free/open-source software.
You can use Java, Perl, or whatever makes sense.
A detailed spec has been written describing exactly what to do, the required URL format (with examples), and how the output is expected to be written. This should be a straightforward project for anyone with scripting experience.
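To illustrate the scrape-then-construct step, here is a minimal Java sketch. The index URL, the `id=NNN` link pattern, and the XML URL template are all hypothetical placeholders; the real values would come from the spec.

```java
import java.util.*;
import java.util.regex.*;

public class UrlBuilder {
    // Hypothetical pattern: assume the index page marks qualifying pages as id=NNN.
    static final Pattern PAGE_ID = Pattern.compile("id=(\\d+)");

    // Extract qualifying page ids from the fetched index page's HTML.
    static List<String> extractIds(String html) {
        List<String> ids = new ArrayList<String>();
        Matcher m = PAGE_ID.matcher(html);
        while (m.find()) ids.add(m.group(1));
        return ids;
    }

    // Build the XML URL for one page from the known pattern (placeholder template).
    static String xmlUrl(String id) {
        return "http://example.com/data/" + id + ".xml";
    }

    public static void main(String[] args) {
        String sampleIndex = "<a href=\"page?id=101\">A</a> <a href=\"page?id=202\">B</a>";
        for (String id : extractIds(sampleIndex)) {
            System.out.println(xmlUrl(id));
        }
    }
}
```

Fetching each constructed URL and saving the response to a file can be done with `java.net.URL`/`HttpURLConnection` from the standard library, so no third-party dependencies are needed. The daily launch on Windows XP can use the built-in Scheduled Tasks wizard or the `schtasks` command-line tool to run the compiled program at a fixed time.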