
Listing files from an Earthdata Cloud collection

Posted: Thu Jan 30, 2025 6:34 pm America/New_York
by sethdd
I'm currently working on a project that requires listing files from an Earthdata AWS Cloud bucket. I'm using rclone to connect to the bucket for a given collection and mount it on my system as a folder. I want to note that this issue is not specific to rclone; any tool that lists objects, such as the AWS CLI, hits the same problem (see the AWS CLI example after the steps below).
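
For context, the mount looks roughly like this (the remote name and mount point are from my setup; yours may differ):

rclone mount earthdata-lp-s3:lp-prod-protected /mnt/lpdaac --read-only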

The issue I'm seeing is that the LP DAAC does not allow listing resources (i.e., files and directories) in a bucket. It appears to be an AWS identity-based policy problem (the s3:ListBucket action), based on the error I'm seeing from rclone:

2025/01/30 16:53:33 ERROR : /: Dir.Stat error: operation error S3: ListObjectsV2, https response error StatusCode: 403, RequestID: HTFSAJ4K70Z7BS1S, HostID: IgeSW+qheqdzD4+a6kdVXZw+RK051DbbBorrpnSxkIWWyh3so/oTQ1gyKWrvukGVe8iCYd6PXgjJ1ZRzaYGcyg==, api error AccessDenied: User: arn:aws:sts::643705676985:assumed-role/s3-same-region-access-role/<EarthData USERNAME> is not authorized to perform: s3:ListBucket on resource: "arn:aws:s3:::lp-prod-public" because no identity-based policy allows the s3:ListBucket action
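
For reference, an identity-based policy statement that would allow listing looks roughly like this (a sketch only, using the bucket name from the error above; this is not the actual LP DAAC policy):

{
  "Effect": "Allow",
  "Action": "s3:ListBucket",
  "Resource": "arn:aws:s3:::lp-prod-public"
}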

I don't get the above error when I mount a collection from the ORNL DAAC: I can successfully mount the folder, navigate through a collection via the file explorer, and open/access files.

------------------------------------------------------
Running rclone commands to access content
------------------------------------------------------
For ease, you can reproduce the same error using the `rclone ls` command:

1. Create a profile in the rclone.conf file for each DAAC (mine are named earthdata-lp-s3 and earthdata-ornl-s3, as used in step 4). Here's how my LP DAAC profile looks:

[earthdata-lp-s3]
type = s3
provider = AWS
region = us-west-2
access_key_id =
secret_access_key =
session_token =

2. Get the temporary Earthdata token credentials (see the curl sketch after these steps):
- https://data.lpdaac.earthdatacloud.nasa.gov/s3credentialsREADME
- https://data.ornldaac.earthdata.nasa.gov/s3credentialsREADME

3. Update each DAAC profile in the rclone.conf file with the credentials that were just generated.

4. Run the `rclone ls` command:

rclone ls earthdata-lp-s3:lp-prod-protected/GEDI01_B.002/GEDI01_B_2023075201011_O24115_04_T08796_02_005_02_V002 -vv
rclone ls earthdata-ornl-s3:ornl-cumulus-prod-protected/gedi/GEDI_L3_LandSurface_Metrics_V2/data -vv
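
As mentioned in step 2, the temporary credentials come from each DAAC's s3credentials endpoint (the README links above minus the README suffix). A minimal curl sketch, assuming your Earthdata Login username and password are in ~/.netrc:

curl -n -L -c /tmp/edl_cookies -b /tmp/edl_cookies https://data.lpdaac.earthdatacloud.nasa.gov/s3credentials

The response is JSON containing the access key ID, secret access key, and session token to paste into rclone.conf.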

You should see something like this: `is not authorized to perform: s3:ListBucket on resource: "arn:aws:s3:::lp-prod-protected" because no identity-based policy allows the s3:ListBucket action`
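
As noted above, this isn't rclone-specific. The same denial is reproducible with the AWS CLI once you export the same temporary credentials (placeholder values here):

export AWS_ACCESS_KEY_ID=<accessKeyId from s3credentials>
export AWS_SECRET_ACCESS_KEY=<secretAccessKey from s3credentials>
export AWS_SESSION_TOKEN=<sessionToken from s3credentials>
aws s3 ls s3://lp-prod-protected/GEDI01_B.002/ --region us-west-2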

----------------------------------------
Good reference details:
----------------------------------------

Overview of bucket and user policies: https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-policy-language-overview.html
ListObjectsV2 API request: https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html

See section "Access denied due to identity-based policies – implicit denial" as it relates to the error I'm seeing: https://docs.aws.amazon.com/AmazonS3/latest/userguide/troubleshoot-403-errors.html#access-denied-message-examples

rclone S3 configuration options: https://rclone.org/s3/#configuration

Re: Listing files from an Earthdata Cloud collection

Posted: Fri Jan 31, 2025 9:44 am America/New_York
by LP DAAC - dgolon
Hello @sethdd, our team is taking a look at your question. We will report back when we have additional details.

Re: Listing files from an Earthdata Cloud collection

Posted: Wed Feb 05, 2025 4:31 pm America/New_York
by sethdd
Excellent! I reached out yesterday, early evening, and I'm hoping to hear back soon.

Re: Listing files from an Earthdata Cloud collection

Posted: Tue Feb 18, 2025 3:10 pm America/New_York
by LP DAAC - dgolon
Just closing the loop on this for other users. We were able to fix an issue we had in the LP DAAC bucket policy. Rclone should successfully run now.

Re: Listing files from an Earthdata Cloud collection

Posted: Tue Feb 18, 2025 3:15 pm America/New_York
by sethdd
LP DAAC - dgolon wrote:
> Just closing the loop on this for other users. We were able to fix an issue
> we had in the LP DAAC bucket policy. Rclone should successfully run now.

Thanks again for the support on resolving this!