I need to work with a Perl expert to achieve the following:
After a user sends an HTTP request, 30 Perl threads are spawned to crawl 30 sites, and the crawled page info is displayed. The following needs to be taken into consideration:
1. What if there are multiple user requests at the same time?
2. What if some sites do not respond at all?
3. What if the connection to some sites times out (forcing a long wait)?
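A minimal sketch of the requirements above, assuming a thread-enabled perl and LWP::UserAgent; the function and variable names are illustrative, not part of the original request. One thread per site means a dead or slow site only stalls its own thread, and the user-agent timeout bounds that stall:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;

# fetch_all(\@urls, \&fetcher) -> hashref mapping url => result string.
# $fetcher is injectable so the logic can be tested without the network;
# it defaults to the LWP-based fetcher below.
sub fetch_all {
    my ($urls, $fetcher) = @_;
    $fetcher ||= \&lwp_fetch;
    # Spawn one thread per URL (30 URLs -> 30 threads, as in the spec).
    my @threads = map {
        my $url = $_;
        threads->create(sub { return ($url, $fetcher->($url)) });
    } @$urls;
    my %result;
    for my $t (@threads) {
        # join() blocks, but each thread is bounded by the fetcher timeout,
        # so an unresponsive site cannot hang the whole batch indefinitely.
        my ($url, $body) = $t->join;
        $result{$url} = $body;
    }
    return \%result;
}

# Default fetcher: short timeout so sites that never answer fail fast
# instead of tying up their thread for minutes.
sub lwp_fetch {
    my ($url) = @_;
    require LWP::UserAgent;
    my $ua  = LWP::UserAgent->new(timeout => 10);
    my $res = $ua->get($url);
    return $res->is_success ? $res->decoded_content
                            : 'ERROR: ' . $res->status_line;
}
```

For concern 1 (multiple simultaneous user requests), each incoming request can call `fetch_all` with its own thread batch, though in production you would usually cap total concurrency with a worker pool (e.g. Thread::Queue) rather than spawning unbounded threads per request.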
Please only bid if you have a similar prototype/demo to show me...
7 freelancers are bidding on average $1000 for this job
I can do it in three days, provided I am given the site URLs and a description of what needs to be extracted.
Dear Sir, I am an experienced Perl developer. I have already completed a number of Perl-related projects at GAF. Best regards, Nadeem
I'd need a better idea of what kind of information you're attempting to retrieve from the sites - the basic functionality is doable, though. Threading is not a problem, nor is it necessarily "unsafe" for this purpose.
Multithreaded web scraping is easy to accomplish with the POE set of Perl modules. More information would be required.