I need to scrape over 100 GB of data from a site (about 250,000 files).
The scraper is written in Perl and works best on Linux, but you can also run it on Windows.
The project must be completed within 30 days at most, so you must download at least 3 GB/day.
A T1 line at your workplace or university is a plus.
I will provide an HDD (or money) to store the data and will cover
the postage cost.
Every day you must send me a [url removed, login to view] file (dir /s > [url removed, login to view])
as proof that you downloaded at least 3 GB that day.
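The daily proof could be produced with a short script. This is only a sketch: the actual report filename was removed from the posting, so the date-stamped name and the `./scraped_data` directory below are assumptions; `find -printf` is a GNU find feature, and on Windows the plain `dir /s > listing.txt` from the posting serves the same purpose.

```shell
mkdir -p scraped_data   # assumed location where the scraper writes its files

# Linux equivalent of the Windows `dir /s > listing.txt` redirect:
# list every file with its size in bytes into a date-stamped report.
find ./scraped_data -type f -printf '%s %p\n' > "listing_$(date +%F).txt"

# Quick self-check against the 3 GB/day quota: sum the size column.
awk '{sum += $1} END {printf "%.2f GB\n", sum / 1024^3}' "listing_$(date +%F).txt"
```

Sending the listing rather than the data itself keeps the daily email small while still showing file names and sizes.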
5 freelancers are bidding an average of $90 for this job
Hi, I have a dedicated server in an LA datacenter. I already have the space and bandwidth available. If your scraper just needs to run in the background, I can do the job. You can download the results from WWW or …
Hello, we are a team with good skills in web design, web development, data entry, and chat and email support, and we are well experienced in this field. We also assure you that we will complete your given task in …