Data download: SSL_CERTIFICATE_VERIFY_FAILED

Use this Forum to find information on, or ask a question about, NASA Earth Science data.
Post Reply
odrukker_usgs
Posts: 3
Joined: Tue Feb 18, 2025 12:39 pm America/New_York
Answers: 0

Data download: SSL_CERTIFICATE_VERIFY_FAILED

by odrukker_usgs » Tue Feb 18, 2025 12:49 pm America/New_York

Hi! I'm having trouble downloading Sentinel-1 data with asf_search. I have tried both a .netrc file and the manual method of providing my Earthdata username/password in the script, and I keep getting an error like the one posted below. I'm running on AWS, and a coworker got the same error when they ran the script. Any insight would be greatly appreciated. Thanks!

Error processing S1A_IW_GRDH_1SDV_20180813T130916_20180813T130945_023228_028653_3B93-GRD_HD: HTTPSConnectionPool(host='datapool.asf.alaska.edu', port=443): Max retries exceeded with url: /GRD_HD/SA/S1A_IW_GRDH_1SDV_20180813T130916_20180813T130945_023228_028653_3B93.zip (Caused by SSLError(SSLCertVerificationError(1, "[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: Hostname mismatch, certificate is not valid for 'datapool.asf.alaska.edu'. (_ssl.c:1020)")))


ASFx - bhauer
Posts: 52
Joined: Tue Dec 03, 2019 3:56 pm America/New_York
Answers: 0

Re: Data download: SSL_CERTIFICATE_VERIFY_FAILED

by ASFx - bhauer » Thu Feb 20, 2025 8:37 pm America/New_York

This may just be a matter of updating your SSL certificates. Try upgrading the certifi package, which contains the trusted-certificate bundle, with this command:

Code:

pip install --upgrade certifi
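If upgrading doesn't change anything, it can also help to check which CA bundle your Python is actually using. A quick standard-library diagnostic sketch (the certifi check at the end only runs if the package is installed):

```python
import ssl

# Show where this Python build looks for trusted CA certificates by default
paths = ssl.get_default_verify_paths()
print('OpenSSL version:', ssl.OPENSSL_VERSION)
print('Default cafile:', paths.cafile)
print('Default capath:', paths.capath)

# If certifi is installed, show the bundle that requests typically uses
try:
    import certifi
    print('certifi bundle:', certifi.where())
except ImportError:
    print('certifi is not installed')
```

If the certifi bundle path differs from the default cafile, the two can be out of sync even after an upgrade.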
Please let us know if this doesn't help.
Bill Hauer
Alaska Satellite Facility DAAC
User Support Office
uso@asf.alaska.edu

odrukker_usgs
Posts: 3
Joined: Tue Feb 18, 2025 12:39 pm America/New_York
Answers: 0

Re: Data download: SSL_CERTIFICATE_VERIFY_FAILED

by odrukker_usgs » Fri Feb 28, 2025 1:14 pm America/New_York

ASFx - bhauer wrote:
> This may be just a matter of updating your SSL certificates. Try updating
> the [i]certifi [/i]package which contains the trusted certificates bundle
> by using this command: [code]pip install --upgrade certifi[/code]
>
> Please let us know if this doesn't help.

I've tried updating certifi and haven't had any luck. Thanks.

ASF - bhauer
User Services
Posts: 15
Joined: Thu Dec 12, 2024 5:54 pm America/New_York
Answers: 0

Re: Data download: SSL_CERTIFICATE_VERIFY_FAILED

by ASF - bhauer » Fri Feb 28, 2025 7:19 pm America/New_York

Someone from our Discovery team is trying to recreate the error, so I hope to have a solution for you by early next week. Thanks for your patience!
Bill Hauer
Alaska Satellite Facility DAAC
User Support Office
uso@asf.alaska.edu

kimfairbanks
Posts: 5
Joined: Thu Oct 06, 2022 8:29 pm America/New_York
Answers: 1

Re: Data download: SSL_CERTIFICATE_VERIFY_FAILED

by kimfairbanks » Mon Mar 03, 2025 3:01 pm America/New_York

Hi, I was unable to recreate this download issue locally or with a fresh EC2 instance. Is there a chance you could provide the failing script?
Also, can you provide the output of the following Python snippet from your AWS service?

import asf_search as asf
session = asf.ASFSession()
session.headers['user-agent']

(The output should look something like "Python/3.x.x; requests/x.x.x; asf_search/x.x.x")

odrukker_usgs
Posts: 3
Joined: Tue Feb 18, 2025 12:39 pm America/New_York
Answers: 0

Re: Data download: SSL_CERTIFICATE_VERIFY_FAILED

by odrukker_usgs » Fri Mar 07, 2025 2:37 pm America/New_York

kimfairbanks wrote:
> Hi, I was unable to recreate this download issue locally or with a fresh
> ec2 instance. Is there a chance you could provide the failing script?
> Also, can you provide the output of the following python snippet from your
> aws service?
>
> import asf_search as asf
> session = asf.ASFSession()
> session.headers['user-agent']
>
> (The output should look something like "Python/3.x.x; requests/x.x.x;
> asf_search/x.x.x")

Here is my output from session.headers['user-agent']: Python/3.13.0; requests/2.32.3; asf_search/8.1.1

Below is the script so far. Thank you for your help!

import asf_search as asf
import pandas as pd
import numpy as np
import os
import sys
import geopandas as gpd
import rasterio
from rasterio.mask import mask
import datetime
import shutil
import zipfile
import logging

# Todo: Add logging, use for all hucs

def main():
    '''This script downloads Sentinel-1 median VH summer data from ASF Vertex'''

    # DATA PREP
    # Upload HUC4 shapefiles
    huc4_paths = [
        r'/caldera/prosper_bayes/data/huc4/WBDHU4_H_1401.shp',  # TEST
    ]  # Todo: Loop with all hucs

    # Define output file structure
    temp_directory = r'/caldera/prosper_bayes/data/temp_dir'
    os.makedirs(temp_directory, exist_ok=True)
    output_directory = r'/caldera/prosper_bayes/data/sentinel1'
    os.makedirs(output_directory, exist_ok=True)

    # Login info
    session = asf.ASFSession().auth_with_creds('username', 'password')

    for huc4_path in huc4_paths:
        # Load HUC4 shapefile
        myhuc = gpd.read_file(huc4_path)
        # Take the HUC code from the filename (e.g. WBDHU4_H_1401.shp -> 1401)
        huc_code = os.path.splitext(os.path.basename(huc4_path))[0].split('_')[-1]

        # Get the bounding box and format for asf_search - can't accept complex polygons
        bbox = myhuc.total_bounds  # [minx, miny, maxx, maxy]
        minx, miny, maxx, maxy = bbox
        polygon = f'POLYGON(({minx} {miny}, {maxx} {miny}, {maxx} {maxy}, {minx} {maxy}, {minx} {miny}))'

        # Define output file structure
        output_filename = f'VHJulyAug2015to2022median{huc_code}.tif'
        output_median_raster = os.path.join(output_directory, output_filename)

        # Check if final output file already exists
        if os.path.exists(output_median_raster):
            print(f'Output file already exists: {output_median_raster}. Skipping download and processing.')
            continue  # Skip to next HUC if exists

        # DEFINE ASF SEARCH PARAMETERS
        search_results = asf.geo_search(
            platform=[asf.PLATFORM.SENTINEL1],
            intersectsWith=polygon,
            polarization='VV+VH',
            beamMode='IW',
            processingLevel='GRD_HD',
            start='July 1, 2015',
            end='August 31, 2022',
            season=[182, 243],  # Summer filter - Jul-Aug
            maxResults=3  # Todo: Test with 3, use all later
        )

        print(f'Found {len(search_results)} total Sentinel1 scenes for HUC {huc_code}')
        if not search_results:
            print('No valid Sentinel1 images found for given parameters')
            continue  # Skip to the next HUC

        # Download results, passing the authenticated session so EDL credentials are used
        for result in search_results:
            try:
                result.download(path=temp_directory, session=session)
                print(f'Downloaded: {result}')
            except Exception as e:
                print(f'Error downloading {result}: {e}')

        # COMPUTE MEDIAN RASTER
        def compute_median_raster(output_median_raster, myhuc):
            '''Compute the median raster from downloaded Sentinel-1 images'''

            # Initialize raster stack
            stacked_arrays = []

            # List downloaded files
            downloaded_files = os.listdir(temp_directory)
            print(f'Files in temp_directory: {downloaded_files}')

            for file_name in downloaded_files:
                if file_name.endswith('.zip'):
                    zip_file_path = os.path.join(temp_directory, file_name)

                    # Extract ZIP file
                    with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
                        zip_ref.extractall(temp_directory)

                    # Find the .tif file
                    extracted_files = os.listdir(temp_directory)
                    print(f'Extracted files: {extracted_files}')
                    temp_tif_path = next(
                        (os.path.join(temp_directory, f) for f in extracted_files if f.endswith('.tif')),
                        None
                    )

                    if temp_tif_path is None:
                        print(f'No .tif file found in extracted contents for {file_name}. Skipping...')
                        continue

                    # Open and mask raster
                    with rasterio.open(temp_tif_path) as src:
                        out_image, out_transform = mask(src, myhuc.geometry, crop=True)

                    if out_image.size > 0:
                        stacked_arrays.append(out_image)

            # Compute median raster if valid images exist
            if stacked_arrays:
                median_array = np.ma.median(np.ma.array(stacked_arrays), axis=0).filled(0)

                # Save the median raster, reusing the metadata of the last masked scene
                with rasterio.open(temp_tif_path) as src:
                    meta = src.meta.copy()
                meta.update(dtype=rasterio.float32, count=1)

                with rasterio.open(output_median_raster, 'w', **meta) as dst:
                    dst.write(median_array.astype(rasterio.float32), 1)

                print(f'Saved median raster: {output_median_raster}')
            else:
                print('No valid images to compute median.')

        # Compute the median raster
        compute_median_raster(output_median_raster, myhuc)

        # Clean up temporary directory
        shutil.rmtree(temp_directory, ignore_errors=True)

if __name__ == '__main__':
    sys.exit(main())

kimfairbanks
Posts: 5
Joined: Thu Oct 06, 2022 8:29 pm America/New_York
Answers: 1

Re: Data download: SSL_CERTIFICATE_VERIFY_FAILED

by kimfairbanks » Wed Mar 19, 2025 6:44 pm America/New_York

Hi again, thanks for the example. I made a simplified version that covers the asf-search portion of it, but I was unable to recreate the error with the same versions of Python, requests, and asf-search installed. Would you be willing to try this base download case?

If this doesn't work, can you share any information about the AWS service you're using and its environment (e.g. OS, architecture, whether it's been updated recently)?
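For convenience, here is a small standard-library sketch that prints the environment details mentioned above:

```python
import platform
import ssl
import sys

# Basic environment fingerprint: Python build, OS, architecture, and OpenSSL
print('Python:', sys.version.replace('\n', ' '))
print('OS:', platform.platform())
print('Architecture:', platform.machine())
print('OpenSSL:', ssl.OPENSSL_VERSION)
```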

import asf_search as asf
import os
from getpass import getpass

temp_directory = './test_dir'
os.makedirs(temp_directory, exist_ok=True)

session = asf.ASFSession().auth_with_creds(input('username'), getpass('edl password'))

# DEFINE ASF SEARCH PARAMETERS
search_results = asf.geo_search(
    platform=[asf.PLATFORM.SENTINEL1],
    intersectsWith='POINT(-106.403 40.646)',
    polarization='VV+VH',
    beamMode='IW',
    processingLevel='GRD_HD',
    start='July 1, 2015',
    end='August 31, 2022',
    season=(182, 243),  # Summer filter - Jul-Aug
    maxResults=3  # Todo: Test with 3, use all later
)

for result in search_results:
    try:
        result.download(temp_directory, session=session)
        print(f'Downloaded: {result}')
    except Exception as e:
        print(f'Error downloading {result}: {e}')
