Our real estate website downloads data on homes for sale from several MLSs (Multiple Listing Services). The data generally arrives on an FTP site in pipe-delimited format. We need a company to maintain the data and associated photos on our website on a daily basis. Ideally the process would be automated to:
1) FTP and download the data and photos each day
2) Import it into databases (one for each MLS)
3) Maintain the databases, deleting records and associated image files as properties are sold and removed from the data feed.
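To make the scope concrete, the daily download step could look roughly like the sketch below. This is only an illustration: the host names, credentials, and directory layout are placeholders, and real MLS accounts would supply the actual values.

```python
import ftplib
import os

# Hypothetical per-MLS settings -- the real hosts, logins, and paths
# come from each MLS's FTP account and are NOT the values shown here.
FEEDS = [
    {"host": "ftp.example-mls.com", "user": "user", "password": "secret",
     "remote_dir": "/export", "local_dir": "data/mls1"},
]

def download_feed(feed):
    """Fetch every file in one feed's remote directory to a local folder."""
    os.makedirs(feed["local_dir"], exist_ok=True)
    with ftplib.FTP(feed["host"]) as ftp:
        ftp.login(feed["user"], feed["password"])
        ftp.cwd(feed["remote_dir"])
        for name in ftp.nlst():
            local_path = os.path.join(feed["local_dir"], name)
            with open(local_path, "wb") as f:
                ftp.retrbinary("RETR " + name, f.write)

def run_daily():
    for feed in FEEDS:
        download_feed(feed)
```

A cron job (or Windows scheduled task) calling `run_daily()` each morning would cover step 1; steps 2 and 3 then operate on the downloaded files.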
Each of the 7 current MLSs formats its data slightly differently. Most deliver the data and images by FTP; however, one uses the RETS system ([url removed, login to view]). The data files are sometimes quite large, so we currently import each file into Quattro Pro (which doesn't have the 256-column limitation of MS Excel) and convert it into smaller CSV files for upload. You may know of a better way to do this. At present we are doing the imports manually through phpMyAdmin, and it is quite time-consuming.
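One likely improvement: a script can split a large pipe-delimited file into smaller CSV chunks directly, with no column limit, which would remove the Quattro Pro step entirely. A minimal sketch, assuming the feed has a header row and `|` as the delimiter (the chunk size and output naming are arbitrary choices):

```python
import csv

def _write_chunk(src_path, part, header, rows):
    """Write one CSV chunk; naming scheme (file.part001.csv) is illustrative."""
    out_path = "%s.part%03d.csv" % (src_path, part)
    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
    return out_path

def split_feed(src_path, rows_per_chunk=5000):
    """Split one large pipe-delimited feed file into smaller CSV files.

    Python's csv module has no 256-column limit, so the spreadsheet
    round-trip can be skipped.
    """
    with open(src_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src, delimiter="|")
        header = next(reader)
        chunk, part, out_paths = [], 0, []
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                part += 1
                out_paths.append(_write_chunk(src_path, part, header, chunk))
                chunk = []
        if chunk:  # flush the final partial chunk
            part += 1
            out_paths.append(_write_chunk(src_path, part, header, chunk))
    return out_paths
```

Alternatively, MySQL's `LOAD DATA INFILE` can import a pipe-delimited file in one pass, which may make splitting unnecessary altogether.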
Most of the data feeds are complete dumps, so we simply empty the table and import the new data each day. However, one feed is an incremental dump that comes with a reference file containing the Listing IDs of the properties currently for sale. That file needs to be compared against the table, and any listings whose Listing IDs no longer appear in the reference file need to be deleted.
The photo images need to be compared by MLS number or Listing ID against the records in the current property data file; those that no longer have an associated active listing need to be deleted.
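The photo cleanup can be scripted the same way. The sketch below assumes photo file names begin with the Listing ID followed by a separator (e.g. `12345_1.jpg`); since each MLS names its images differently, the pattern would need adjusting per feed, and a dry-run mode is included so deletions can be reviewed before they happen.

```python
import os
import re

def delete_orphan_photos(photo_dir, active_ids, dry_run=True):
    """Remove photos whose leading Listing ID is no longer active.

    Assumes names like 12345_1.jpg (ID, separator, sequence number) --
    an assumption, since each MLS may use a different naming scheme.
    """
    pattern = re.compile(r"^(\w+?)[_\-.]")
    removed = []
    for name in os.listdir(photo_dir):
        m = pattern.match(name)
        if m and m.group(1) not in active_ids:
            removed.append(name)
            if not dry_run:
                os.remove(os.path.join(photo_dir, name))
    return removed
```

Running with `dry_run=True` first and logging the returned list gives an audit trail before any files are actually deleted.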
We plan to add additional MLS data feeds over the coming months.