I'm looking for someone who can download all records from a website, parse them, and store them in a MySQL database. Since the website is updated quite often, the project should ideally be delivered in a form I can run here once a week to re-insert all records into my local database.
Here is an example: [url removed, login to view]
On this page you will find all information about this fighter: his height, birthday, weight, and, most importantly, his previous opponents and the results of his fights. By following the links to his opponents' pages, we get the same information about them. There is no need to download information about the events, only the fighters, so you can restrict your downloads to the "/fighter" subdirectory.
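A minimal sketch of the crawling step, in Python (one of the preferred languages below). It parses a fighter page with the standard library and keeps only links into the "/fighter" subdirectory, discarding event pages. The base URL and the exact link layout are assumptions, since the real site's URL was removed from the posting.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class FighterLinkParser(HTMLParser):
    """Collects links pointing into the /fighter subdirectory
    (hypothetical URL layout; the real site's URL was removed)."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        absolute = urljoin(self.base_url, href)
        # Keep only fighter profile pages; skip /event and everything else.
        if "/fighter/" in absolute:
            self.links.add(absolute)

def extract_fighter_links(html, base_url):
    """Return the sorted fighter-profile links found in one page's HTML."""
    parser = FighterLinkParser(base_url)
    parser.feed(html)
    return sorted(parser.links)

sample = '<a href="/fighter/123-joe">Joe</a> <a href="/event/9">Event 9</a>'
print(extract_fighter_links(sample, "https://example.com"))
# → ['https://example.com/fighter/123-joe']
```

Feeding each newly discovered link back into a download queue gives the breadth-first crawl over opponents that the posting describes.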
The database might be quite large (~100,000 records or more), so the initial download might take a while.
Preferred languages are PHP, Perl, Bash, Python, or any combination of those. The database of choice is MySQL.
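Since the job must be re-run weekly without duplicating rows, the MySQL load step is naturally an upsert. A sketch of one way to build it, assuming a hypothetical `fighters` table whose `fighter_id` column is a PRIMARY or UNIQUE key:

```python
def upsert_fighter_sql(columns, table="fighters", key="fighter_id"):
    """Build a parametrized MySQL upsert so weekly re-runs refresh
    existing rows instead of inserting duplicates.
    Assumes `key` is a PRIMARY or UNIQUE key (hypothetical schema)."""
    cols = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    # On a key collision, overwrite every non-key column with the new value.
    updates = ", ".join(f"{c}=VALUES({c})" for c in columns if c != key)
    return (f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
            f"ON DUPLICATE KEY UPDATE {updates}")

sql = upsert_fighter_sql(["fighter_id", "name", "height_cm", "birthday"])
print(sql)
# INSERT INTO fighters (fighter_id, name, height_cm, birthday)
#   VALUES (%s, %s, %s, %s)
#   ON DUPLICATE KEY UPDATE name=VALUES(name), height_cm=VALUES(height_cm),
#   birthday=VALUES(birthday)
```

The `%s` placeholders match the parameter style of common MySQL drivers such as `mysqlclient` or `mysql-connector-python`, so the string can be passed to `cursor.executemany()` with one tuple per fighter.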
Looking forward to seeing your bids!
32 freelancers are bidding on average ¥20370 for this job
Hello Sir/Madam, I am an expert in web scraping (I rank #1 for web scraping [url removed, login to view]). I have done many similar jobs and am ready to start. Thanks, Alex
Dear potential employer, Perl/web/MySQL professional here. Please accept this bid to have your task done with the best professional quality at a reasonable price. Looking forward to your response.
Hello, team of Perl/web/database professionals here. We would be really glad to complete this task for you with the best professional quality. Looking forward to an update from you.