I need to do the following. I assume bash is best.
1) Download 400,000 images from a remote FTP server (I have credentials for it) and resize them. I should be able to run this script at any time to retrieve only new/updated images (-n).
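For the bulk pull, a minimal sketch using wget's FTP mirroring: `-m` recurses and `-N` (timestamping) only transfers files that are new or updated on the server, which matches the re-runnable requirement. The host, user, and directory names below are placeholders, not real details from this posting; the function just prints the command for review rather than running it.

```shell
#!/usr/bin/env bash
# Sketch of the bulk mirror step. HOST, USER, and REMOTE_DIR are
# placeholders; the real values would come from the FTP credentials.
HOST="ftp.example.com"
USER="ftpuser"
REMOTE_DIR="/images"
LOCAL_DIR="./incoming"

# Build the wget invocation: -m mirrors recursively, -N transfers a file
# only when the remote copy is newer, -nH/--cut-dirs flatten the local path.
build_mirror_cmd() {
  printf 'wget -m -N -nH --cut-dirs=1 -P %s ftp://%s@%s%s\n' \
    "$LOCAL_DIR" "$USER" "$HOST" "$REMOTE_DIR"
}

build_mirror_cmd   # print the command for review instead of running it blind
```

Because `-N` skips unchanged files, subsequent runs only pay for new or updated images, which keeps repeat runs fast.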
2) A second script should download all images from an MMDDYYYY folder that is created on the remote FTP server each day. These images should also be resized.
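The daily folder name can be derived directly from the date, so the second script never needs the folder name as an argument. This is a sketch: the MMDDYYYY format comes from the posting, but the commented download line and its host/user are hypothetical.

```shell
#!/usr/bin/env bash
# Compute the name of today's daily folder on the FTP server.
# The posting says folders are named MMDDYYYY (month, day, 4-digit year).
daily_folder() {
  date +%m%d%Y        # always an 8-digit name
}

# Hypothetical download of just that folder (placeholder user/host, not run here):
#   wget -m -N "ftp://user@host/$(daily_folder)/"
daily_folder
```

Run from cron once a day, this picks up each new folder automatically.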
3) All images need a second, smaller copy. Images are named like [url removed, login to view], so we also need to make a smaller copy called [url removed, login to view], for example.
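Since the actual filenames were stripped from this posting, the `_small` suffix, the `123456.jpg` example name, and the 300px bound below are all assumptions for illustration. The name mapping is pure bash; the resize step uses ImageMagick's `convert` as one common option and is guarded so the sketch is safe on machines without it.

```shell
#!/usr/bin/env bash
# Derive the thumbnail name from the original filename. The real naming
# convention was removed from the posting, so "123456.jpg" ->
# "123456_small.jpg" is an assumed convention.
small_name() {
  local f="$1"
  printf '%s_small.%s\n' "${f%.*}" "${f##*.}"
}

# Hypothetical resize using ImageMagick; "300x300>" shrinks only images
# larger than 300px and preserves aspect ratio. The size is a placeholder.
make_small() {
  local src="$1" dst
  dst="$(small_name "$src")"
  command -v convert >/dev/null && convert "$src" -resize '300x300>' "$dst"
}

small_name "123456.jpg"    # -> 123456_small.jpg
```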
4) Most importantly, we need to come up with a storage structure for these images so we don't have 800,000+ images in one folder. These images are pulled dynamically from a webpage; in other words, the page associated with database record 123456 will load image [url removed, login to view] into the page. So you need to devise a system that can determine where [url removed, login to view] is stored, so the page associated with 123456 can find it.
Note that some records have multiple files, like [url removed, login to view], [url removed, login to view], etc.
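One common scheme for this: shard by the record id itself. Zero-pad the id to a fixed width and use the leading digit pairs as directory levels; the webpage can then compute the path from the record number alone, and files like `123456.jpg` and `123456_2.jpg` land in the same directory. The six-digit padding and two-level depth below are assumptions to illustrate the idea, not part of the spec.

```shell
#!/usr/bin/env bash
# Map an image filename to a sharded storage path. With ids padded to six
# digits and two two-digit levels, each leaf directory holds at most ~100
# record ids, so 800,000+ files never pile up in one folder. Padding width
# and depth are assumptions chosen for illustration.
shard_path() {
  local f="$1"
  local id="${f%%[._]*}"               # record id: digits before "." or "_"
  local padded
  padded="$(printf '%06d' "$((10#$id))")"  # force base 10, pad to 6 digits
  printf '%s/%s/%s\n' "${padded:0:2}" "${padded:2:2}" "$f"
}

shard_path "123456.jpg"      # -> 12/34/123456.jpg
shard_path "123456_2.jpg"    # -> 12/34/123456_2.jpg (same directory)
```

Because the path is a pure function of the record id, the page for record 123456 can reconstruct `12/34/` without any lookup table, and the multi-file records stay together.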
I have existing scripts that I can send you that import and resize the files (both the mass import and the daily MMDDYYYY import), so essentially your job is to make sure the images are stored optimally.
It is extremely important that these scripts run as efficiently and quickly as possible. I am using this project as sort of a 'test' to add another developer to my team. If you do a good job on this I will have more jobs for you immediately. I'm looking for someone who is extremely good, fast, good at communicating, and who does what they say they will do when they say they will do it. I'm easy to work with if you are easy to work with :)
I'd like this delivered within 48 hours of bid selection. 100% in escrow to start the project. Tell me what your ideas are for how to do this and let's get started.