This application should be able to browse a given URL, download its contents, and create a copy on the local system. For example, the user specifies the target URL [login to view URL] and the depth, from the starting URL, to which the web spider should fetch pages. The application should follow all links from the initial URL down to the specified depth and download the pages, including pictures, images, sounds, animations, etc., to a local folder that can be browsed later without connecting to the internet.
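The depth-limited traversal described above can be sketched as a breadth-first crawl. The following Python sketch is illustrative only, not a required implementation: the `fetch` callable, the `LinkExtractor` class, and the `same_domain_only` flag are assumed names, and a real version would also save each fetched resource to the local folder.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href/src attribute values from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def crawl(start_url, depth, fetch, same_domain_only=True):
    """Breadth-first crawl from start_url down to the given link depth.

    `fetch(url)` is caller-supplied and returns the page's HTML; in the
    real application it would perform the HTTP request and write the
    file to the project folder. Returns the set of URLs visited.
    """
    start_host = urlparse(start_url).netloc
    visited = set()
    queue = deque([(start_url, 0)])
    while queue:
        url, level = queue.popleft()
        if url in visited or level > depth:
            continue
        if same_domain_only and urlparse(url).netloc != start_host:
            continue  # skip links that leave the starting domain
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(fetch(url))
        for link in parser.links:
            # resolve relative links against the current page
            queue.append((urljoin(url, link), level + 1))
    return visited
```

Injecting `fetch` keeps the traversal logic testable without network access; the same hook is where the "text only / text and graphics / everything" filter could later be applied.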
Required features:
" The application should be user friendly.
" Each download should be saved as a project. Projects shall support operations like new, open, save as and close.
" Application should display the open project and its details preferably in a tree view
" Each download should support operations like pause, resume, stop and abort.
" User should have the option to download only text/text and graphics/everything.
" It should show the statistics of each download number of pages requested, percentage completed/remaining, time elapsed/remaining.
" If some sites requires authentication, so the facility to provide the username and password.
" Facility should be there for the user to restrict the download to the same domain. For example if we are downloading contents from [login to view URL] it may have links to other sites which should not be downloaded.
" Downloading should not be synchronous.