Chiming in to say I've noticed this same issue with my code that previously worked:
#### REPRODUCIBLE CODE ####
from pystac_client import Client

STAC_URL = 'https://cmr.earthdata.nasa.gov/stac'
collections = ['HLSL30_2.0', 'HLSS30_2.0']  # HLS v2.0 Landsat and Sentinel-2 products
start = '2021-01-01'
end = '2024-12-01'
bbox = [-105.55427843, 35.64105739, -105.31137177, 35.81262559]

catalog = Client.open(f"{STAC_URL}/LPCLOUD")
catalog.add_conforms_to('COLLECTIONS')
search = catalog.search(collections=collections, bbox=bbox, datetime=f"{start}/{end}")
print('Total matches for search:', search.matched())
In my case, the total match count is 390. But when I try to retrieve the items via:
item_list = list(search.items())
it only returns the default page size of 20 items, or up to about 150 if I increase the limit parameter. Anything higher raises the same error (see the partial workaround sketch after the traceback):
############ TRACEBACK ###############
---------------------------------------------------------------------------
APIError Traceback (most recent call last)
Cell In[40], line 1
----> 1 item_list = list(search.items())
File ~/envulmo2/lib/python3.12/site-packages/pystac_client/item_search.py:694, in ItemSearch.items(self)
687 def items(self) -> Iterator[Item]:
688 """Iterator that yields :class:`pystac.Item` instances for each item matching
689 the given search parameters.
690
691 Yields:
692 Item : each Item matching the search criteria
693 """
--> 694 for item in self.items_as_dicts():
695 # already signed in items_as_dicts
696 yield Item.from_dict(item, root=self.client, preserve_dict=False)
File ~/envulmo2/lib/python3.12/site-packages/pystac_client/item_search.py:705, in ItemSearch.items_as_dicts(self)
698 def items_as_dicts(self) -> Iterator[Dict[str, Any]]:
699 """Iterator that yields :class:`dict` instances for each item matching
700 the given search parameters.
701
702 Yields:
703 Item : each Item matching the search criteria
704 """
--> 705 for page in self.pages_as_dicts():
706 for item in page.get("features", []):
707 # already signed in pages_as_dicts
708 yield item
File ~/envulmo2/lib/python3.12/site-packages/pystac_client/item_search.py:737, in ItemSearch.pages_as_dicts(self)
735 if isinstance(self._stac_io, StacApiIO):
736 num_items = 0
--> 737 for page in self._stac_io.get_pages(
738 self.url, self.method, self.get_parameters()
739 ):
740 call_modifier(self.modifier, page)
741 features = page.get("features", [])
File ~/envulmo2/lib/python3.12/site-packages/pystac_client/stac_api_io.py:297, in StacApiIO.get_pages(self, url, method, parameters)
285 def get_pages(
286 self,
287 url: str,
288 method: Optional[str] = None,
289 parameters: Optional[Dict[str, Any]] = None,
290 ) -> Iterator[Dict[str, Any]]:
291 """Iterator that yields dictionaries for each page at a STAC paging
292 endpoint, e.g., /collections, /search
293
294 Return:
295 Dict[str, Any] : JSON content from a single page
296 """
--> 297 page = self.read_json(url, method=method, parameters=parameters)
298 if not (page.get("features") or page.get("collections")):
299 return None
File ~/envulmo2/lib/python3.12/site-packages/pystac/stac_io.py:205, in StacIO.read_json(self, source, *args, **kwargs)
188 def read_json(self, source: HREF, *args: Any, **kwargs: Any) -> dict[str, Any]:
189 """Read a dict from the given source.
190
191 See :func:`StacIO.read_text <pystac.StacIO.read_text>` for usage of
(...)
203 given source.
204 """
--> 205 txt = self.read_text(source, *args, **kwargs)
206 return self.json_loads(txt)
File ~/envulmo2/lib/python3.12/site-packages/pystac_client/stac_api_io.py:168, in StacApiIO.read_text(self, source, *args, **kwargs)
166 href = str(source)
167 if _is_url(href):
--> 168 return self.request(href, *args, **kwargs)
169 else:
170 with open(href) as f:
File ~/envulmo2/lib/python3.12/site-packages/pystac_client/stac_api_io.py:220, in StacApiIO.request(self, href, method, headers, parameters)
218 raise APIError(str(err))
219 if resp.status_code != 200:
--> 220 raise APIError.from_response(resp)
221 try:
222 return resp.content.decode("utf-8")
APIError: {"errors":["Oops! Something has gone wrong. We have been alerted and are working to resolve the problem. Please try your request again later."]}
Thanks to everyone for flagging this so I know I'm not alone!