I need to work with a Perl expert to achieve the following:
After a user sends an HTTP request, 30 Perl threads are spawned to crawl 30 sites, and the crawled web pages (info) are then displayed. Please take the following into consideration:
1. What if there are multiple user requests at the same time?
2. What if there is no response from some sites?
3. What if the connection to some sites times out (and would otherwise block for a long time)?
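To illustrate the kind of approach I have in mind, here is a minimal sketch (not a finished solution) using core threads, Thread::Queue, and LWP::UserAgent. The URLs are placeholders, and the per-request timeout value is an assumption; concurrent user requests would additionally need process-level isolation (e.g., each request handled in its own worker process under a preforking web server):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;
use LWP::UserAgent;

# Placeholder site list -- substitute the real 30 URLs.
my @sites = map { "http://example$_.com/" } 1 .. 30;

my $results = Thread::Queue->new;

# One thread per site; each thread reports success or failure.
my @workers = map {
    my $url = $_;
    threads->create(sub {
        # timeout bounds how long we wait on a slow/dead site
        # (10 seconds here is an assumed value, tune as needed).
        my $ua  = LWP::UserAgent->new( timeout => 10 );
        my $res = $ua->get($url);
        $results->enqueue(
            $res->is_success
            ? "$url: OK, " . length( $res->decoded_content ) . " bytes"
            : "$url: FAILED (" . $res->status_line . ")"
        );
    });
} @sites;

$_->join for @workers;      # all threads finish within the timeout bound
$results->enqueue(undef);   # sentinel: no more results

# Display whatever was collected, including failures.
while ( defined( my $line = $results->dequeue ) ) {
    print "$line\n";
}
```

A site that never responds or hangs simply yields a FAILED line after the timeout instead of stalling the whole page, which covers points 2 and 3 above.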
Please only bid if you have a similar prototype/demo to show me.