I think the USGS server at https://dds.cr.usgs.gov/srtm/ is no longer serving data to unauthenticated HTTP requests.
Every request seems to return a 404 Not Found.
Example: https://dds.cr.usgs.gov/srtm/version2_1/SRTM1//Region_01/N40W114.hgt.zip returns a 404.
It looks like the tests are failing because of this. Granted, I'm just running `go test` and haven't debugged further.
```
--- FAIL: TestGetElevation (0.00s)
    srtm_test.go:51: Invalid elevation for (45.277500, 13.726111): NaN, but should be 246.000000
    srtm_test.go:51: Invalid elevation for (-26.400000, 146.250000): NaN, but should be 301.000000
    srtm_test.go:51: Invalid elevation for (-12.100000, -77.016667): NaN, but should be 133.000000
    srtm_test.go:51: Invalid elevation for (40.750000, -111.883333): NaN, but should be 1298.000000
```
My current workaround is to download the data from https://search.earthdata.nasa.gov/ (NASA Shuttle Radar Topography Mission Global 3 arc second V003) and put it into the cache directory under the filename that the go-elevations library is looking for. Earthdata does require a free account to access the downloads.
Earthdata produces a script that downloads each file individually.
Here's an example of one of the new download links:
https://e4ftl01.cr.usgs.gov//DP133/SRTM/SRTMGL3.003/2000.02.11/S34W059.SRTMGL3.hgt.zip
The script itself is just a bash script that uses curl to retrieve the data. Its authentication is achieved via a netrc file, so it looks like plain basic authentication.
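For anyone trying the script: the netrc file it reads is the usual `~/.netrc` format. Based on NASA's Earthdata documentation it should look roughly like this (replace the placeholders with your own Earthdata credentials):

```
machine urs.earthdata.nasa.gov
    login YOUR_EARTHDATA_USERNAME
    password YOUR_EARTHDATA_PASSWORD
```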
Here are some highlights from the script:
```bash
setup_auth_curl() {
    # First, check whether the URL requires URS authentication
    status=$(curl -s -z "$(date)" -w '%{http_code}' https://e4ftl01.cr.usgs.gov//DP133/SRTM/SRTMGL3.003/2000.02.11/S34W059.SRTMGL3.hgt.zip | tail -1)
    if [[ "$status" -ne "200" && "$status" -ne "304" ]]; then
        # URS authentication is required. Now check whether the application/remote service is approved.
        detect_app_approval
    fi
}

fetch_urls() {
    if command -v curl >/dev/null 2>&1; then
        setup_auth_curl
        while read -r line; do
            # Get everything after the last '/'
            filename="${line##*/}"
            # Strip everything after '?'
            stripped_query_params="${filename%%\?*}"
            curl -f -b "$cookiejar" -c "$cookiejar" -L --netrc-file "$netrc" -g -o "$stripped_query_params" -- "$line" && echo || exit_with_error "Command failed with error. Please retrieve the data manually."
        done
    fi
}
```
So in theory, I guess the library could be updated to use a new urls.json, with a username and password configured in the client creation helper? But I'm not sure this host will stay available: e4ftl01.cr.usgs.gov
Anyway, just posting my findings; hopefully what I wrote makes sense! It's definitely a bummer if they did take down the SRTM service, since the library has worked great thus far!