1. a script which I will host on my server
2. to download a list of dynamic web pages
3. using anonymous proxies
4. by switching the proxy for each page request
5. and at the end of the project, zip all downloaded files into one file for easy download.
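Step 5 above, bundling every downloaded file of a project into a single archive, could be sketched with Python's standard `zipfile` module. The `project_dir` layout and function name here are assumptions for illustration, not part of the spec:

```python
import zipfile
from pathlib import Path

def zip_project(project_dir: str, archive_path: str) -> None:
    """Bundle every downloaded file in a project directory into one zip
    (hypothetical layout: one directory per project)."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(Path(project_dir).rglob("*")):
            if f.is_file():
                # Store paths relative to the project root inside the archive.
                zf.write(f, arcname=f.relative_to(project_dir))
```

Writing the archive outside `project_dir` avoids the zip trying to include itself.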
The script will be launched through a web page with a form where I can enter a project name and upload a text file. The text file will contain thousands of URLs of dynamic pages, for example:
[url removed, login to view]
[url removed, login to view];State=gigcsvvd
Some of the URLs may point to secure (https://) pages.
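Reading the uploaded list could be as simple as one URL per line, accepting both http and https and skipping blank lines. The one-URL-per-line format and function name are assumptions, a minimal sketch rather than the final parser:

```python
from urllib.parse import urlsplit

def load_urls(path: str) -> list[str]:
    """Read one URL per line from the uploaded text file,
    keeping only http/https entries and skipping blank lines."""
    urls = []
    with open(path) as fh:
        for line in fh:
            url = line.strip()
            if not url:
                continue
            # Accept both plain and secure pages.
            if urlsplit(url).scheme in ("http", "https"):
                urls.append(url)
    return urls
```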
Currently I use wget to download the pages, but my requirements are growing and it won't be feasible to keep managing things this way for long.
It is very important that the script use anonymous proxies and rotate them dynamically, because the web servers will deny repeated requests from the same IP.
The script should support multiple projects and maintain a log. It should record the status of each record, which can be done by updating the same text file that I upload.
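Updating the status in the uploaded file itself could work by rewriting the file with a status column appended to each finished URL. The tab-separated format and the status values are assumptions for illustration; any marker the poster prefers would work the same way:

```python
def mark_status(list_path: str, url: str, status: str) -> None:
    """Rewrite the uploaded URL list, setting a tab-separated status
    (e.g. DONE or FAILED, hypothetical values) on the matching record."""
    lines = []
    with open(list_path) as fh:
        for line in fh:
            parts = line.rstrip("\n").split("\t")
            if parts[0] == url:
                parts = [parts[0], status]  # replace any previous status
            lines.append("\t".join(parts))
    with open(list_path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
```

Rewriting the whole file per update is fine for thousands of lines; a separate append-only log would record timestamps and errors alongside.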
Thanks for your interest.