Downloading all files from a URL folder to a local folder
I want to download all the contents of a URL folder on Windows 10, e.g.
https://example.com/folder
The folder may contain 100 files, and I want to put them all into
C:\backups
I am going to schedule this script to run once every hour, and I don't want it to re-download files that already exist.
I've tried a few scripts, like:
bitsadmin.exe /transfer "test" https://website.com/folder/ C:\backups
without any success. Note that I have very little experience with batch scripts.
You will find a couple of useful PowerShell scripts in the post "How to download all files from URL?".
Here is one of the two:
$outputdir = 'D:\Downloads'
$url = 'https://opendata.dwd.de/climate_environment/CDC/observations_germany/climate/daily/kl/recent/'
# enable TLS 1.2 and TLS 1.1 protocols
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12, [Net.SecurityProtocolType]::Tls11
$WebResponse = Invoke-WebRequest -Uri $url
# get the list of links, skip the first one ("../") and download the files
$WebResponse.Links | Select-Object -ExpandProperty href -Skip 1 | ForEach-Object {
    Write-Host "Downloading file '$_'"
    $filePath = Join-Path -Path $outputdir -ChildPath $_
    $fileUrl = '{0}/{1}' -f $url.TrimEnd('/'), $_
    Invoke-WebRequest -Uri $fileUrl -OutFile $filePath
}
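Since you also want the hourly run to skip files that are already in the target folder, you can wrap the download in a `Test-Path` check. A sketch of that variant, assuming the directory listing exposes plain file names as links (the URL and `C:\backups` path are taken from your question as placeholders):

```powershell
$outputdir = 'C:\backups'
$url = 'https://example.com/folder/'

# enable the TLS 1.2 protocol
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$WebResponse = Invoke-WebRequest -Uri $url
# skip the first link ("../"), then download only files not already present
$WebResponse.Links | Select-Object -ExpandProperty href -Skip 1 | ForEach-Object {
    $filePath = Join-Path -Path $outputdir -ChildPath $_
    if (Test-Path -Path $filePath) {
        Write-Host "Skipping '$_' (already downloaded)"
    }
    else {
        Write-Host "Downloading file '$_'"
        $fileUrl = '{0}/{1}' -f $url.TrimEnd('/'), $_
        Invoke-WebRequest -Uri $fileUrl -OutFile $filePath
    }
}
```

Note that this only checks for the file's existence, not whether the remote copy has changed; if the server updates files in place, you would need to compare timestamps or sizes as well.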