I need an image spider that downloads all the images from a website to my local drive and can then upload the images I choose to a database on another website. See the attached JPG for a mockup of how it might look.
Here is how it will work. I give it a URL, and it downloads all the images from that URL. It also records the URL where each image was located.
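To make the crawl step concrete, here is a minimal sketch using only the Python standard library. All the names here (`ImageLinkParser`, `extract_image_urls`, `download_images`) are illustrative, not part of any existing spider; a real bid would likely build on an existing crawler as noted below.

```python
# Sketch: find every <img> on a page, download each image, and
# record the URL where it was found.
import os
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlretrieve

class ImageLinkParser(HTMLParser):
    """Collects the src of every <img> tag, resolved against the page URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.image_urls.append(urljoin(self.base_url, src))

def extract_image_urls(html, page_url):
    parser = ImageLinkParser(page_url)
    parser.feed(html)
    return parser.image_urls

def download_images(image_urls, page_url, dest_dir="images"):
    """Saves each image locally; the returned records are the
    'where did this come from' log the spider needs to keep."""
    os.makedirs(dest_dir, exist_ok=True)
    records = []  # (local_path, image_url, page_url)
    for url in image_urls:
        local_path = os.path.join(dest_dir, os.path.basename(url.split("?")[0]))
        urlretrieve(url, local_path)  # network call
        records.append((local_path, url, page_url))
    return records
```

A full-site spider would additionally follow `<a href>` links within the same domain and run `extract_image_urls` on each page it visits.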
Once the website is downloaded, the spider lets me view thumbnails of the images it has recorded. I can then select the images I want and categorize them in an online database (the DB will be provided for you to interface with). The program must then resize each image to create a thumbnail, and FTP both the thumbnail and the URL where the image was originally located to the database, into the category I specify in the image spider. The program will then automatically delete those images from the queue, and I can check another batch of images to catalogue.
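The resize-and-upload step might be sketched as follows. The FTP host, credentials, and one-directory-per-category layout are placeholder assumptions (the real database interface would come with the project), and the actual pixel resizing would likely use Pillow's `Image.thumbnail`; shown here are just the aspect-ratio arithmetic and the stdlib FTP transfer.

```python
# Sketch: compute a thumbnail size, then FTP the thumbnail plus its
# source URL into the chosen category. Host/credentials are placeholders.
import io
from ftplib import FTP

def thumbnail_size(width, height, max_side=128):
    """Scale (width, height) so the longer side equals max_side,
    preserving aspect ratio. Images already small enough are untouched."""
    longest = max(width, height)
    if longest <= max_side:
        return (width, height)
    scale = max_side / longest
    return (round(width * scale), round(height * scale))

def upload_thumbnail(thumb_path, source_url, category,
                     host="ftp.example.com", user="user", password="secret"):
    """Uploads the thumbnail into the category's directory and writes the
    original image URL alongside it as a small text file, over plain FTP."""
    name = thumb_path.split("/")[-1]
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(category)  # assumed: one remote directory per category
        with open(thumb_path, "rb") as f:
            ftp.storbinary(f"STOR {name}", f)
        # record the page the image originally came from
        ftp.storbinary(f"STOR {name}.url", io.BytesIO(source_url.encode()))
```

After a successful upload, the spider would remove the image's record from its local queue so the next batch can be reviewed.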
Attached is a sample of what the program might look like. The red area is an online database that is already running and working; you will need to interface with it. That is what the user will use to categorize where images should be uploaded to.
I need you to create what is below the red area. You can build on a preexisting image spider. This will not be a commercial app; it will only be used privately. I just care that it works.
If this project interests you, please bid below.