I have an SFTP server with hundreds of files, and new files are added daily. I also have a local server holding "processed" files that originated on the SFTP server; it receives new files from the source every day.
My problem is that every day I have to find the newest files (the ones not yet processed), get them from the SFTP server, and copy them into a DIFFERENT folder than the one where I keep the processed files on the local server. (Think of it as a queue for further internal processing.) I don't have the time or bandwidth to download anything more than the files I actually need and haven't yet processed.
I essentially want a one-way "synchronize", but with the new/different files copied to a different directory than the one being compared against for synchronization. Is this possible?
One other hitch: I have to use SSIS and winscp.com scripting to get it done.
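For reference, a plain one-way sync in a WinSCP script would look roughly like the sketch below (host, credentials, and paths are all placeholders I made up), but as far as I can tell it downloads into the same local directory it compares against, which is exactly what I'm trying to avoid:

    # sync.txt -- WinSCP script; host, credentials, and paths are placeholders
    option batch abort
    option confirm off
    open sftp://myuser:mypassword@sftp.example.com/ -hostkey="ssh-rsa 2048 xx:xx:xx..."
    # one-way sync remote -> local, but the target folder is also the comparison folder
    synchronize local "D:\Processed" "/outbound/files"
    exit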
In the past, when connecting to an FTP server, I'd get a list of all the files in a directory, sort them by date descending, and load that into an array of strings in SSIS. From there I can handle it without any problem.
Anyone have thoughts/suggestions?
The best approach I can come up with right now is to pipe the output of a script that logs in and lists all the files in the directory. I can then clean up the resulting file and import the filenames back into SSIS, but it's not the cleanest method available.
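Roughly what I have in mind (again, host, credentials, and paths are placeholders):

    # listfiles.txt -- WinSCP script; host, credentials, and paths are placeholders
    option batch abort
    option confirm off
    open sftp://myuser:mypassword@sftp.example.com/ -hostkey="ssh-rsa 2048 xx:xx:xx..."
    # list the remote directory; the console output gets redirected to a text file
    ls /outbound/files
    exit

called from an Execute Process Task with something like:

    winscp.com /script=listfiles.txt /log=listfiles.log > filelist.txt

and then I'd clean up filelist.txt and read the filenames back into SSIS from there.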