In Progress

Perl function to fetch HTML code

Hi,

I need a Perl function "grab" that retrieves all URLs in @urls and puts their contents into an array (so that $content[0] holds the content of $urls[0], etc.). If the timeout is exceeded, the content should be "timeout". It's important that this function can deal with a large number of URLs very efficiently. In the past I've used LWP::Parallel::UserAgent to accomplish this.

I'd like to use the code as shown in the example script below, so please write your function so that I can drop it into that script and it will work. Thank you.

Also, please take a look at the Google Answers solution page at [url removed, login to view]. It shows the solution I obtained for the same issue back in 2004, using LWP::Parallel::UserAgent. Looking at it might give you some useful pointers. However, the solution on that page no longer works because of a recently emerged problem with LWP::Parallel::UserAgent, which is explained at [url removed, login to view] (it's because of that problem that I've posted this project here). The script will have to run in a [url removed, login to view] environment, and currently it seems that LWP::Parallel::UserAgent does not work with the pre-installed version of LWP on Bluehost. So it would be preferable to have a solution that does not depend on LWP::Parallel::UserAgent at all but is still capable of quickly retrieving a large number of URLs.

Thank you.

Marc.

#!/usr/bin/perl

@urls=("[url removed, login to view]", "[url removed, login to view]", "[url removed, login to view]");

$timeout = 20; # each request times out after 20 seconds

@content = &grab(@urls);

print "everything grabed";

exit;

sub grab {

# your grab function here

}
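
For context, here is a minimal sketch of one way grab could be implemented without LWP::Parallel::UserAgent, using only plain LWP::UserAgent (which the post says is pre-installed on Bluehost) plus Perl's built-in fork and pipe. It is an illustrative fork-per-URL sketch meant to drop into the skeleton above, not the awarded solution; it reads the $timeout global set by the script.

use LWP::UserAgent;   # plain LWP only; no LWP::Parallel::UserAgent

sub grab {
    my @urls = @_;
    my (@content, @readers);

    # Fork one child per URL. Each child fetches its URL with a plain
    # LWP::UserAgent and writes the body (or "timeout") back to the
    # parent through its own pipe.
    for my $i (0 .. $#urls) {
        pipe(my $reader, my $writer) or die "pipe failed: $!";
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {                                # child process
            close $reader;
            my $ua  = LWP::UserAgent->new(timeout => $timeout || 20);
            my $res = $ua->get($urls[$i]);
            # Any failed request (including one that timed out) is
            # reported as "timeout", per the requirement above.
            print {$writer} $res->is_success ? $res->content : "timeout";
            close $writer;
            exit 0;
        }
        close $writer;                                  # parent keeps the read end
        $readers[$i] = $reader;
    }

    # Read the pipes in order so $content[0] corresponds to $urls[0].
    for my $i (0 .. $#urls) {
        my $fh = $readers[$i];
        local $/;                                       # slurp the whole pipe
        $content[$i] = <$fh>;
        $content[$i] = "timeout" unless defined $content[$i] && length $content[$i];
        close $fh;
    }
    wait() for @urls;                                   # reap all children
    return @content;
}

The trade-off in this sketch is that it forks one process per URL, so for very large URL lists a bounded worker pool (for example Parallel::ForkManager, if that module happens to be available on the host) would be gentler on shared hosting.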

Skills: Perl


About the Employer:
( 84 reviews ) Miami Beach, United States

Project ID: #700154

Awarded to:

recremd

Please consider my bid, and see the project clarification board for additional information. I am an experienced developer with over 10 years of professional internet development experience. I have written countless …

$30 USD in 1 day
(2 Reviews)
1.6

11 freelancers are bidding on average $83 for this job

gangabass

I can do this job for you. See PM for details.

$30 USD in 1 day
(147 Reviews)
5.9
srinichal

memory usage could be really high for this project

$120 USD in 4 days
(26 Reviews)
5.6
freelance4hire80

I can do this

$30 USD in 0 days
(25 Reviews)
5.2
SigmaVisual

We can help in your project, please check PMB and our ratings/reviews to get idea of our experience.

$225 USD in 5 days
(13 Reviews)
5.3
amsak

Placing bid

$30 USD in 15 days
(5 Reviews)
3.0
freelancefoo

Hey Marc, I'm new to freelancer but have 13 years of Perl experience & botting/scraping. Please view my PM for more details. Thanks! Dan

$100 USD in 5 days
(1 Review)
2.0
robinbowes

Please see PM. Thanks, R.

$50 USD in 1 day
(2 Reviews)
1.7
sailavish

Hi, I have 6+ years of experience using Perl (LWP, object-oriented Perl, APIs from CPAN), PHP and MySQL to build web applications and implement middle-tier as well as backend objects. I feel this project is an idea …

$150 USD in 4 days
(0 Reviews)
0.0
manishbhaumik

10+ yrs of IT experience with enough exposure to similar perl programs. Please see the PMB for clarifications.

$50 USD in 3 days
(0 Reviews)
0.0
macnod

I will write a multithreaded, command-line web client that can fetch data from multiple URLs concurrently. I will write this program quickly, and I will provide an option to have it read the URLs from a file (in addi …

$100 USD in 1 day
(0 Reviews)
0.0