Hi
@evbogdan There are a couple of lines in the code you provided that differ from the HLS tutorial (https://github.com/nasa/HLS-Data-Resources/blob/main/python/tutorials/HLS_Tutorial.ipynb) and seem to be causing the problem.
The lines are:
```python
with rio.Env(**gdal_config):  # Setting up GDAL environment
    with rio.open(_i) as src:
```
We think that if you revert that code block back to the original, it should work.
Try the code below; it can also be found in the original tutorial in case the formatting doesn't render correctly on the forum:
```python
# Use vsicurl to load the data directly into memory (be patient, may take a few seconds)
chunk_size = dict(band=1, x=512, y=512)  # Tiles have 1 band and are divided into 512x512 pixel chunks

# Sometimes a vsi curl error occurs, so we need to retry if it does
max_retries = 10

for e in indices_bands_links:
    print(e)
    # Try Loop
    for _i in range(max_retries):
        try:
            # Open and build datasets
            if e.rsplit('.', 2)[-2] == indices_bands[0]:    # NIR index
                nir = rxr.open_rasterio(e, chunks=chunk_size, masked=True).squeeze('band', drop=True)
                nir.attrs['scale_factor'] = 0.0001  # hard coded the scale_factor attribute
            elif e.rsplit('.', 2)[-2] == indices_bands[1]:  # red index
                red = rxr.open_rasterio(e, chunks=chunk_size, masked=True).squeeze('band', drop=True)
                red.attrs['scale_factor'] = 0.0001  # hard coded the scale_factor attribute
            elif e.rsplit('.', 2)[-2] == indices_bands[2]:  # blue index
                blue = rxr.open_rasterio(e, chunks=chunk_size, masked=True).squeeze('band', drop=True)
                blue.attrs['scale_factor'] = 0.0001  # hard coded the scale_factor attribute
            break  # Break out of the retry loop
        except Exception as ex:
            print(f"vsi curl error: {ex}. Retrying...")
    else:
        print(f"Failed to process {e} after {max_retries} retries. Please check to see you're authenticated with earthaccess.")

print("The COGs have been loaded into memory!")
```
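For reference, the snippet above assumes you have already authenticated and configured GDAL earlier in the notebook, the way the tutorial's setup cells do. Here is a minimal sketch of that setup; the cookie file path and the specific environment-variable values are assumptions for illustration, not the tutorial's exact cell:

```python
# Minimal setup sketch (assumed values; see the tutorial's earlier cells for the exact configuration)
import os
import earthaccess
import rioxarray as rxr  # imported as rxr, as used in the retry loop above

# Authenticate with Earthdata Login; persist=True saves credentials to ~/.netrc
earthaccess.login(persist=True)

# GDAL settings commonly used for streaming HLS COGs over /vsicurl/
cookie_path = os.path.expanduser('~/cookies.txt')  # assumed cookie file location
os.environ['GDAL_HTTP_COOKIEFILE'] = cookie_path
os.environ['GDAL_HTTP_COOKIEJAR'] = cookie_path
os.environ['GDAL_DISABLE_READDIR_ON_OPEN'] = 'EMPTY_DIR'
os.environ['CPL_VSIL_CURL_ALLOWED_EXTENSIONS'] = 'TIF'
```

With that in place, rxr.open_rasterio can read the links in indices_bands_links directly, without wrapping each open in rio.Env / rio.open.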
Thanks -- Danielle