I use a hosting company that produces backup files nightly. The backups are stored in subfolders named after the date and time of the backup, but on my local system I need the backup files in one fixed location for further processing.
I've used WinSCP for years to download the files over WebDAV, calling a WinSCP script file from a batch file like this:
"C:\Program Files (x86)\WinSCP\WinSCP.exe" /script="DownloadBackup.winscp"
The DownloadBackup.winscp file looks like this:
open https://user:pass@www.hosting.com/backup
# By omitting the trailing slash in the destination, the files are placed directly in the backup folder, instead of WinSCP creating a backup_year_month_date_time subfolder
get -filemask="*>=24H" * -speed=10240 c:\data\backup
This construction always ensured the latest backup was available in the c:\data\backup folder.
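In effect, the dated remote folder itself is downloaded under the fixed local name, e.g. (file name made up for illustration):

/backup/2023-06-19_0300/site.sql  ->  c:\data\backup\site.sql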
I am now moving to a fully command-line-driven setup (without the batch and script files) and run the following instead:
"c:\program files (x86)\winscp\winscp.com" /command "open https://user:pass@www.hosting.com/backup" "get -filemask=*>=24H * -speed=10240 c:\data\backup" "exit"
This gives me the message:
Are you sure you want to transfer multiple files to a single file 'backup' in a directory 'c:\data'?
The files will overwrite one another.
If you actually want to transfer all files to a directory 'c:\data\backup', keeping their name, make sure you terminate the path with a slash.
When I end the destination path with a backslash, the folder structure from the hosting server is copied as well, so my backup files end up in the folder
c:\data\backup\2023-06-19_0300\
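That is, with the destination written as c:\data\backup\, roughly like this (note the final backslash has to be doubled on the actual command line, so it is not taken as escaping the closing quote):

"c:\program files (x86)\winscp\winscp.com" /command "open https://user:pass@www.hosting.com/backup" "get -filemask=*>=24H * -speed=10240 c:\data\backup\\" "exit"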
Is it possible to download the files from a remote subfolder directly into a given local folder, without recreating the remote subfolder locally, using the /command option?