Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-sat-search for 
openSUSE:Factory checked in at 2022-10-08 01:26:02
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-sat-search (Old)
 and      /work/SRC/openSUSE:Factory/.python-sat-search.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-sat-search"

Sat Oct  8 01:26:02 2022 rev:2 rq:1008864 version:0.3.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-sat-search/python-sat-search.changes      
2020-08-18 15:05:54.551913062 +0200
+++ 
/work/SRC/openSUSE:Factory/.python-sat-search.new.2275/python-sat-search.changes
    2022-10-08 01:26:29.618404237 +0200
@@ -1,0 +2,9 @@
+Fri Oct  7 15:21:21 UTC 2022 - Yogalakshmi Arunachalam <yarunacha...@suse.com>
+
+- Update to version 0.3.0
+  * Updated to work with STAC API v0.9.0 and v1.0.0-beta.2
+  * SATUTILS_API_URL envvar changed to STAC_API_URL and default value removed. Specify with envvar or pass into Search when using the library
+  * When downloading, specify filename_template for location instead of both datadir and filename.
+  * Update pagination to precisely follow STAC spec
+
+-------------------------------------------------------------------
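
The endpoint change noted in the changelog can be illustrated with a small sketch: as of 0.3.0 there is no built-in default URL, so a search fails unless a URL is passed in or `STAC_API_URL` is set. `resolve_api_url` below is a hypothetical helper mirroring the check in `satsearch/search.py`, not part of sat-search itself.

```python
import os

def resolve_api_url(url=None):
    # Mirrors the 0.3.0 behavior: no default endpoint is shipped.
    # The URL must be passed explicitly or set via STAC_API_URL.
    url = url or os.getenv('STAC_API_URL')
    if url is None:
        raise ValueError(
            "URL not provided, pass into Search or define STAC_API_URL")
    # sat-search normalizes the URL to end in a single trailing slash
    return url.rstrip('/') + '/'
```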

Old:
----
  sat-search-0.2.3.tar.gz

New:
----
  sat-search-0.3.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-sat-search.spec ++++++
--- /var/tmp/diff_new_pack.pUNnyE/_old  2022-10-08 01:26:29.962405026 +0200
+++ /var/tmp/diff_new_pack.pUNnyE/_new  2022-10-08 01:26:29.966405034 +0200
@@ -1,7 +1,7 @@
 #
 # spec file for package python-sat-search
 #
-# Copyright (c) 2020 SUSE LLC
+# Copyright (c) 2022 SUSE LLC
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -20,7 +20,7 @@
 %define skip_python2 1
 %{?!python_module:%define python_module() python-%{**} python3-%{**}}
 Name:           python-sat-search
-Version:        0.2.3
+Version:        0.3.0
 Release:        0
 Summary:        A tool for discovering and downloading publicly available satellite imagery
 License:        MIT
@@ -34,7 +34,7 @@
 BuildRequires:  python-rpm-macros
 Requires:       python-sat-stac
 Requires(post): update-alternatives
-Requires(postun): update-alternatives
+Requires(postun):update-alternatives
 BuildArch:      noarch
 %python_subpackages
 

++++++ sat-search-0.2.3.tar.gz -> sat-search-0.3.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/PKG-INFO 
new/sat-search-0.3.0/PKG-INFO
--- old/sat-search-0.2.3/PKG-INFO       2020-06-25 17:38:22.000000000 +0200
+++ new/sat-search-0.3.0/PKG-INFO       2020-08-21 23:30:14.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: sat-search
-Version: 0.2.3
+Version: 0.3.0
 Summary: A python client for sat-api
 Home-page: https://github.com/sat-utils/sat-search
 Author: Matthew Hanson (matthewhanson)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/README.md 
new/sat-search-0.3.0/README.md
--- old/sat-search-0.2.3/README.md      2020-06-24 18:57:40.000000000 +0200
+++ new/sat-search-0.3.0/README.md      2020-08-21 23:29:00.000000000 +0200
@@ -2,7 +2,15 @@
 
 
[![CircleCI](https://circleci.com/gh/sat-utils/sat-search.svg?style=svg&circle-token=a66861b5cbba7acd4abd7975f804ab061a365e1b)](https://circleci.com/gh/sat-utils/sat-search)
 
-Sat-search is a Python 3 library and a command line tool for discovering and downloading publicly available satellite imagery using a conformant API such as [sat-api](https://github.com/sat-utils/sat-api).
+Sat-search is a Python 3 library and a command line tool for discovering and downloading publicly available satellite imagery using a STAC-compliant API.
+
+## STAC APIs
+
+Starting with v0.3.0, sat-search does not have a default STAC endpoint. An endpoint can be passed as a parameter when using the library, or defined via the environment variable `STAC_API_URL`. Endpoints known to work are listed in this table:
+
+| Endpoint  | Data |
+| --------  | ----  |
+| https://earth-search.aws.element84.com/v0  | Sentinel-2 |
 
 
 ## Installation
@@ -34,6 +42,7 @@
 | --------   | ----  |
 | 0.1.x      | 0.5.x - 0.6.x |
 | 0.2.x      | 0.5.x - 0.7.x |
+| 0.3.x      | 0.9.x - 1.0.0-beta.2 |
 
 
 ## Using sat-search
@@ -51,7 +60,7 @@
 $ sat-search -h
 usage: sat-search [-h] {search,load} ...
 
-sat-search (v0.2.0)
+sat-search (v0.3.0)
 
 positional arguments:
   {search,load}
@@ -69,27 +78,31 @@
 ```
 $ sat-search search -h
 usage: sat-search search [-h] [--version] [-v VERBOSITY]
-                         [--print-md [PRINTMD [PRINTMD ...]]] [--print-cal]
-                         [--save SAVE] [-c COLLECTION] [--ids [IDS [IDS ...]]]
-                         [--bbox BBOX BBOX BBOX BBOX]
+                         [--print-md [PRINTMD [PRINTMD ...]]]
+                         [--print-cal PRINTCAL] [--save SAVE]
+                         [-c [COLLECTIONS [COLLECTIONS ...]]]
+                         [--ids [IDS [IDS ...]]] [--bbox BBOX BBOX BBOX BBOX]
                          [--intersects INTERSECTS] [--datetime DATETIME]
-                         [--sort [SORT [SORT ...]]] [--found]
-                         [-p [PROPERTY [PROPERTY ...]]] [--url URL]
+                         [-q [QUERY [QUERY ...]]]
+                         [--sortby [SORTBY [SORTBY ...]]] [--found]
+                         [--url URL] [--headers HEADERS] [--limit LIMIT]
 
 optional arguments:
   -h, --help            show this help message and exit
   --version             Print version and exit
   -v VERBOSITY, --verbosity VERBOSITY
-                        0:quiet, 1:error, 2:warning, 3:info, 4:debug (default: 
2)
+                        0:quiet, 1:error, 2:warning, 3:info, 4:debug (default:
+                        2)
 
 output options:
   --print-md [PRINTMD [PRINTMD ...]]
-                        Print specified metadata for matched scenes (default: 
None)
-  --print-cal           Print calendar showing dates (default: False)
+                        Print specified metadata for matched scenes (default:
+                        None)
+  --print-cal PRINTCAL  Print calendar showing dates (default: None)
   --save SAVE           Save results as GeoJSON (default: None)
 
 search options:
-  -c COLLECTION, --collection COLLECTION
+  -c [COLLECTIONS [COLLECTIONS ...]], --collections [COLLECTIONS [COLLECTIONS 
...]]
                         Name of collection (default: None)
   --ids [IDS [IDS ...]]
                         One or more scene IDs from provided collection
@@ -101,32 +114,36 @@
                         GeoJSON Feature (file or string) (default: None)
   --datetime DATETIME   Single date/time or begin and end date/time (e.g.,
                         2017-01-01/2017-02-15) (default: None)
-  --sort [SORT [SORT ...]]
+  -q [QUERY [QUERY ...]], --query [QUERY [QUERY ...]]
+                        Query properties of form KEY=VALUE (<, >, <=, >=, =
+                        supported) (default: None)
+  --sortby [SORTBY [SORTBY ...]]
                         Sort by fields (default: None)
   --found               Only output how many Items found (default: False)
-  -p [PROPERTY [PROPERTY ...]], --property [PROPERTY [PROPERTY ...]]
-                        Properties of form KEY=VALUE (<, >, <=, >=, =
-                        supported) (default: None)
-  --url URL             URL of the API (default: https://n34f767n91.execute-
-                        api.us-east-1.amazonaws.com/prod)
-```
+  --url URL             URL of the API (default: None)
+  --headers HEADERS     Additional request headers (JSON file or string)
+                        (default: None)
+  --limit LIMIT         Limits the total number of items returned (default:
+                        None)
 
 **Search options**
 
-- **collection** - Search only a specific collection. This is a shortcut, 
collection can also be provided as a property (e.g., `-p 
"collection=landsat-8-l1"`)
+- **collections** - Search only specific collections. This is a shortcut; a collection can also be provided as a query (e.g., `-q "collection=landsat-8-l1"`)
 - **ids** - Fetch the Item for the provided IDs in the given collection 
(collection must be provided). All other search options will be ignored.
 - **intersects** - Provide a GeoJSON Feature string or the name of a GeoJSON 
file containing a single Feature that is a Polygon of an AOI to be searched.
 - **datetime** - Provide a single partial or full datetime (e.g., 2017, 2017-10, 2017-10-11, 2017-10-11T12:00), or two separated by a slash that defines a range, e.g., 2017-01-01/2017-06-30 will search for scenes acquired in the first 6 months of 2017.
-- **property** - Allows searching for any other scene properties by providing 
the pair as a string (e.g. `-p "landsat:row=42"`, `-p "eo:cloud_cover<10"`). 
Supported symbols include: =, <, >, >=, and <=
-- **sort** - Sort by specific properties in ascending or descending order. A 
list of properties can be provided which will be used for sorting in that order 
of preference. By default a property will be sorted in descending order. To 
specify the order the property can be preceded with '<' (ascending) or '>' 
(descending). e.g., `--sort ">datetime" "<eo:cloud_cover" will sort by 
descending date, then by ascending cloud cover
-- **found** - This will print out the total number of scenes found, then exit 
without fetching the actual items. 
-- **url** - The URL endpoint of a STAC compliant API, this can also be set 
with the environment variable SATUTILS_API_URL
+- **query** - Allows searching for any other scene properties by providing the pair as a string (e.g. `-q "landsat:row=42"`, `-q "eo:cloud_cover<10"`). Supported symbols include: =, <, >, >=, and <=
+- **sortby** - Sort by specific properties in ascending or descending order. A list of properties can be provided, which will be used for sorting in that order of preference. By default a property is sorted in descending order. To specify the order, the property can be preceded with '<' (ascending) or '>' (descending). e.g., `--sortby ">datetime" "<eo:cloud_cover"` will sort by descending date, then by ascending cloud cover
+- **found** - This will print out the total number of scenes found, then exit without fetching the actual items (i.e., the query is made with limit=0).
+- **url** - The URL endpoint of a STAC compliant API; this can also be set with the environment variable STAC_API_URL
+- **headers** - Additional request headers, useful for specifying authentication parameters
+- **limit** - Limits the total number of Items returned
 
 **Output options**
 These options control what to do with the search results; multiple switches can be provided.
 
 - **print-md** - Prints a list of specific metadata fields for all the scenes. 
If given without any arguments it will print a list of the dates and scene IDs. 
Otherwise it will print a list of fields that are provided. (e.g., --print-md 
date eo:cloud_cover eo:platform will print a list of date, cloud cover, and the 
satellite platform such as WORLDVIEW03)
-- **print-cal** - Prints a text calendar (see iumage below) with specific days 
colored depending on the platform of the scene (e.g. landsat-8), along with a 
legend.
+- **print-cal** - Prints a text calendar (see image below) with specific days colored, grouped by a provided property name (e.g. platform), along with a legend.
 - **save** - Saves results as a FeatureCollection. The FeatureCollection 
'properties' contains all of the arguments used in the search and the 
'features' contain all of the individual scenes, with individual scene metadata 
merged with collection level metadata (metadata fields that are the same across 
all one collection, such as eo:platform)
 
 ![](images/calendar.png)
@@ -137,11 +154,9 @@
 Scenes that were previously saved with `sat-search search --save ...` can be 
loaded with the `load` subcommand.
 
 ```
-$ sat-search load -h
-usage: sat-search load [-h] [--version] [-v VERBOSITY]
-                       [--print-md [PRINTMD [PRINTMD ...]]] [--print-cal]
-                       [--save SAVE] [--datadir DATADIR] [--filename FILENAME]
-                       [--download [DOWNLOAD [DOWNLOAD ...]]]
+usage: sat-search load [-h] [--version] [-v VERBOSITY] [--print-md [PRINTMD 
[PRINTMD ...]]] [--print-cal PRINTCAL]
+                       [--save SAVE] [--filename_template FILENAME_TEMPLATE]
+                       [--download [DOWNLOAD [DOWNLOAD ...]]] 
[--requester-pays]
                        items
 
 positional arguments:
@@ -151,23 +166,21 @@
   -h, --help            show this help message and exit
   --version             Print version and exit
   -v VERBOSITY, --verbosity VERBOSITY
-                        0:quiet, 1:error, 2:warning, 3:info, 4:debug (default:
-                        2)
+                        0:quiet, 1:error, 2:warning, 3:info, 4:debug (default: 
2)
 
 output options:
   --print-md [PRINTMD [PRINTMD ...]]
-                        Print specified metadata for matched scenes (default:
-                        None)
-  --print-cal           Print calendar showing dates (default: False)
+                        Print specified metadata for matched scenes (default: 
None)
+  --print-cal PRINTCAL  Print calendar showing dates (default: None)
   --save SAVE           Save results as GeoJSON (default: None)
 
 download options:
-  --datadir DATADIR     Directory pattern to save assets (default:
-                        ./${eo:platform}/${date})
-  --filename FILENAME   Save assets with this filename pattern based on
-                        metadata keys (default: ${id})
+  --filename_template FILENAME_TEMPLATE
+                        Save assets with this filename pattern based on 
metadata keys (default:
+                        ${collection}/${date}/${id})
   --download [DOWNLOAD [DOWNLOAD ...]]
                         Download assets (default: None)
+  --requester-pays      Acknowledge paying egress costs for downloads (if in requester pays bucket) (default: False)
 ```
 
 Note that while the search options are gone, output options are still 
available and can be used with the search results loaded from the file. There 
is also a new series of options for downloading data.
@@ -177,18 +190,16 @@
 
 **Download options**
 These control the downloading of assets. The filename_template can include metadata patterns that will be substituted per scene.
-- **datadir** - This specifies where downloaded assets will be saved to. It 
can also be specified by setting the environment variable SATUTILS_DATADIR.
-- **filename** - The name of the file to save. It can also be set by setting 
the environment variable SATUTILS_FILENAME
 - **download** - Provide a list of keys to download these assets. More 
information on downloading data is provided below.
-
-**Metadata patterns**
-Metadata patterns can be used in **datadir** and **filename** in order to have 
custom path and filenames based on the Item metadata. For instance specifying 
datadir as "./${eo:platform}/${date}" will save assets for each Item under 
directories of the platform and the date. So a landsat-8 Item from June 20, 
2018 will have it's assets saved in a directory './landsat-8/2017-06-20'. For 
filenames these work exactly the same way, except the filename will contain a 
suffix containing the asset key and the appropriate extension.
-
-```
-    sat-search load scenes.json --download thumbnail MTL
-```
-
-In this case the defaults for `datadir` ("./${eo:platform}/${date}") and 
`filename` ("${id}") are used so the download files are saved like this:
+- **filename_template** - This specifies the filename prefix where downloaded 
assets will be saved to based on a template using properties from the specific 
STAC Item. Supported fields:
+    - ${id}: The ID of the STAC Item
+    - ${collection}: The collection of the STAC Item
+    - ${date}: The date portion of the `datetime` property
+    - ${year}: The year of the `datetime` property
+    - ${month}: The month of the `datetime` property
+    - ${day}: The day of the month of the `datetime` property
+    - ${<property>}: Any STAC Item property may be used, e.g. "${eo:cloud_cover}", "${platform}"
+  The actual filename will be this prefix followed by the asset key and an appropriate extension. For example, specifying `filename_template` as "./${eo:platform}/${date}/${id}" will save assets for each Item under directories of the platform and the date. Thus, a landsat-8 Item from June 20, 2018 will have its assets saved in the directory './landsat-8/2018-06-20/'. A metadata asset with the key `MTL` would be saved as './landsat-8/2018-06-20/LC80090292018275LGN00_MTL.TIF'. The last component of the filename_template is taken as the filename. See the example directory structure below.
 
 ```
 landsat-8/
@@ -209,4 +220,4 @@
 This [Jupyter notebook tutorial](tutorial-1.ipynb) covers all the main 
features of the library.
 
 ## About
-sat-search was created by [Development Seed](<http://developmentseed.org>) and 
is part of a collection of tools called 
[sat-utils](https://github.com/sat-utils).
+sat-search is part of a collection of tools called 
[sat-utils](https://github.com/sat-utils).
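
The `filename_template` expansion described in the README above can be sketched with Python's `string.Template`; the real substitution is performed inside sat-stac, so `expand_template` and its derived `date`/`year`/`month`/`day` keys are illustrative assumptions (and colon-containing keys like `eo:platform` are not valid `string.Template` identifiers, so this sketch only covers simple keys).

```python
from string import Template

def expand_template(template, item_props):
    # Derive the date fields a 0.3.0 template may reference
    # from the Item's ISO 8601 `datetime` property.
    date = item_props['datetime'][:10]
    subs = dict(item_props, date=date, year=date[:4],
                month=date[5:7], day=date[8:10])
    return Template(template).substitute(subs)

# e.g. the 0.3.0 default template:
path = expand_template('${collection}/${date}/${id}',
                       {'collection': 'landsat-8',
                        'datetime': '2018-06-20T00:00:00Z',
                        'id': 'LC80090292018275LGN00'})
```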
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/requirements.txt 
new/sat-search-0.3.0/requirements.txt
--- old/sat-search-0.2.3/requirements.txt       2020-06-24 18:57:40.000000000 
+0200
+++ new/sat-search-0.3.0/requirements.txt       2020-08-21 23:29:00.000000000 
+0200
@@ -1 +1 @@
-sat-stac~=0.3.0
+sat-stac~=0.4.0
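
The `-q/--query` strings described in the README diff above are converted into STAC `query` dicts. This is a condensed sketch of the operator parsing that appears in `satsearch/search.py` further below; the function name is mine.

```python
# Operators listed longest-first so '>=' is not misread as '>'.
SEARCH_OPS = ['>=', '<=', '=', '>', '<']
OP_TO_STAC = {'>=': 'gte', '<=': 'lte', '=': 'eq', '>': 'gt', '<': 'lt'}

def parse_query(query_strings):
    queries = {}
    for q in query_strings:
        for op in SEARCH_OPS:
            parts = q.split(op)
            if len(parts) == 2:
                # e.g. 'eo:cloud_cover<10' -> {'eo:cloud_cover': {'lt': '10'}}
                queries.setdefault(parts[0], {})[OP_TO_STAC[op]] = parts[1]
                break
    return queries
```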
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/sat_search.egg-info/PKG-INFO 
new/sat-search-0.3.0/sat_search.egg-info/PKG-INFO
--- old/sat-search-0.2.3/sat_search.egg-info/PKG-INFO   2020-06-25 
17:38:22.000000000 +0200
+++ new/sat-search-0.3.0/sat_search.egg-info/PKG-INFO   2020-08-21 
23:30:14.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: sat-search
-Version: 0.2.3
+Version: 0.3.0
 Summary: A python client for sat-api
 Home-page: https://github.com/sat-utils/sat-search
 Author: Matthew Hanson (matthewhanson)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/sat_search.egg-info/SOURCES.txt 
new/sat-search-0.3.0/sat_search.egg-info/SOURCES.txt
--- old/sat-search-0.2.3/sat_search.egg-info/SOURCES.txt        2020-06-25 
17:38:22.000000000 +0200
+++ new/sat-search-0.3.0/sat_search.egg-info/SOURCES.txt        2020-08-21 
23:30:14.000000000 +0200
@@ -10,7 +10,6 @@
 sat_search.egg-info/top_level.txt
 satsearch/__init__.py
 satsearch/cli.py
-satsearch/config.py
 satsearch/search.py
 satsearch/version.py
 test/__init__.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/sat-search-0.2.3/sat_search.egg-info/dependency_links.txt 
new/sat-search-0.3.0/sat_search.egg-info/dependency_links.txt
--- old/sat-search-0.2.3/sat_search.egg-info/dependency_links.txt       
2020-06-25 17:38:22.000000000 +0200
+++ new/sat-search-0.3.0/sat_search.egg-info/dependency_links.txt       
2020-08-21 23:30:14.000000000 +0200
@@ -1,2 +1,2 @@
-sat-stac~=0.3.0
+sat-stac~=0.4.0
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/sat_search.egg-info/requires.txt 
new/sat-search-0.3.0/sat_search.egg-info/requires.txt
--- old/sat-search-0.2.3/sat_search.egg-info/requires.txt       2020-06-25 
17:38:22.000000000 +0200
+++ new/sat-search-0.3.0/sat_search.egg-info/requires.txt       2020-08-21 
23:30:14.000000000 +0200
@@ -1 +1 @@
-sat-stac~=0.3.0
+sat-stac~=0.4.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/satsearch/cli.py 
new/sat-search-0.3.0/satsearch/cli.py
--- old/sat-search-0.2.3/satsearch/cli.py       2020-06-24 18:57:40.000000000 
+0200
+++ new/sat-search-0.3.0/satsearch/cli.py       2020-08-21 23:29:00.000000000 
+0200
@@ -4,13 +4,13 @@
 import os
 import sys
 
-import satsearch.config as config
-
 from .version import __version__
 from satsearch import Search
 from satstac import ItemCollection
 from satstac.utils import dict_merge
 
+API_URL = os.getenv('STAC_API_URL', None)
+
 
 class SatUtilsParser(argparse.ArgumentParser):
 
@@ -26,19 +26,18 @@
 
         self.download_parser = argparse.ArgumentParser(add_help=False)
         self.download_group = 
self.download_parser.add_argument_group('download options')
-        self.download_group.add_argument('--datadir', help='Directory pattern 
to save assets', default=config.DATADIR)
-        self.download_group.add_argument('--filename', default=config.FILENAME,
+        self.download_group.add_argument('--filename_template', 
default='${collection}/${date}/${id}',
                            help='Save assets with this filename pattern based 
on metadata keys')
         self.download_group.add_argument('--download', help='Download assets', 
default=None, nargs='*')
-        h = 'Acknowledge paying egress costs for downloads (if in request pays 
bucket)'
-        self.download_group.add_argument('--requestor-pays', help=h, 
default=False, action='store_true', dest='requestor_pays')
+        h = 'Acknowledge paying egress costs for downloads (if in requester 
pays bucket on AWS)'
+        self.download_group.add_argument('--requester-pays', help=h, 
default=False, action='store_true', dest='requester_pays')
 
         self.output_parser = argparse.ArgumentParser(add_help=False)
         self.output_group = self.output_parser.add_argument_group('output 
options')
         h = 'Print specified metadata for matched scenes'
         self.output_group.add_argument('--print-md', help=h, default=None, 
nargs='*', dest='printmd')
         h = 'Print calendar showing dates'
-        self.output_group.add_argument('--print-cal', help=h, default=False, 
action='store_true', dest='printcal')
+        self.output_group.add_argument('--print-cal', help=h, dest='printcal')
         self.output_group.add_argument('--save', help='Save results as 
GeoJSON', default=None)
 
     def parse_args(self, *args, **kwargs):
@@ -55,19 +54,26 @@
         if 'verbosity' in args:
             logging.basicConfig(stream=sys.stdout, 
level=(50-args.pop('verbosity') * 10))
 
-        # set global configuration options
-        if 'url' in args:
-            config.API_URL = args.pop('url')
-        if 'datadir' in args:
-            config.DATADIR = args.pop('datadir')
-        if 'filename' in args:
-            config.FILENAME = args.pop('filename')
-
         # if a filename, read the GeoJSON file
         if 'intersects' in args:
             if os.path.exists(args['intersects']):
                 with open(args['intersects']) as f:
-                    args['intersects'] = json.loads(f.read())
+                    data = json.loads(f.read())
+                    if data['type'] == 'Feature':
+                        args['intersects'] = data['geometry']
+                    elif data['type'] == 'FeatureCollection':
+                        args['intersects'] = data['features'][0]['geometry']
+                    else:
+                        args['intersects'] = data
+
+        # If a filename, read the JSON file
+        if 'headers' in args:
+            if os.path.exists(args['headers']):
+                with open(args['headers']) as f:
+                    headers = json.loads(f.read())
+            else:
+                headers = json.loads(args['headers'])
+            args['headers'] = {k: str(v) for k,v in headers.items()}
 
         return args
 
@@ -81,17 +87,19 @@
         sparser = subparser.add_parser('search', help='Perform new search of 
items', parents=parents)
         """ Adds search arguments to a parser """
         parser.search_group = sparser.add_argument_group('search options')
-        parser.search_group.add_argument('-c', '--collection', help='Name of 
collection', default=None)
+        parser.search_group.add_argument('-c', '--collections', help='Name of 
collection', nargs='*')
         h = 'One or more scene IDs from provided collection (ignores other 
parameters)'
         parser.search_group.add_argument('--ids', help=h, nargs='*', 
default=None)
         parser.search_group.add_argument('--bbox', help='Bounding box (min 
lon, min lat, max lon, max lat)', nargs=4)
         parser.search_group.add_argument('--intersects', help='GeoJSON Feature 
(file or string)')
         parser.search_group.add_argument('--datetime', help='Single date/time 
or begin and end date/time (e.g., 2017-01-01/2017-02-15)')
-        parser.search_group.add_argument('-p', '--property', nargs='*', 
help='Properties of form KEY=VALUE (<, >, <=, >=, = supported)')
-        parser.search_group.add_argument('--sort', help='Sort by fields', 
nargs='*')
+        parser.search_group.add_argument('-q', '--query', nargs='*', 
help='Query properties of form KEY=VALUE (<, >, <=, >=, = supported)')
+        parser.search_group.add_argument('--sortby', help='Sort by fields', 
nargs='*')
         h = 'Only output how many Items found'
         parser.search_group.add_argument('--found', help=h, 
action='store_true', default=False)
-        parser.search_group.add_argument('--url', help='URL of the API', 
default=config.API_URL)
+        parser.search_group.add_argument('--url', help='URL of the API', 
default=API_URL)
+        parser.search_group.add_argument('--headers', help='Additional request 
headers (JSON file or string)', default=None)
+        parser.search_group.add_argument('--limit', help='Limits the total 
number of items returned', default=None)
 
         parents.append(parser.download_parser)
         lparser = subparser.add_parser('load', help='Load items from previous 
search', parents=parents)
@@ -106,21 +114,23 @@
                 setattr(namespace, n, {'eq': v})
 
 
-def main(items=None, printmd=None, printcal=False, found=False,
-         save=None, download=None, requestor_pays=False, **kwargs):
+def main(items=None, printmd=None, printcal=None,
+         found=False, filename_template='${collection}/${date}/${id}',
+         save=None, download=None, requester_pays=False, headers=None, 
**kwargs):
     """ Main function for performing a search """
     
     if items is None:
         ## if there are no items then perform a search
-        search = Search.search(**kwargs)
+        search = Search.search(headers=headers, **kwargs)
+        ## Commenting out found logic until functions correctly.
         if found:
-            num = search.found()
-            print('%s items found' % num)
-            return num
-        items = search.items()
+             num = search.found(headers=headers)
+             print('%s items found' % num)
+             return num
+        items = search.items(headers=headers)
     else:
         # otherwise, load a search from a file
-        items = ItemCollection.load(items)
+        items = ItemCollection.open(items)
 
     print('%s items found' % len(items))
 
@@ -130,7 +140,7 @@
 
     # print calendar
     if printcal:
-        print(items.calendar())
+        print(items.calendar(printcal))
 
     # save all metadata in JSON file
     if save is not None:
@@ -142,7 +152,7 @@
             # get complete set of assets
             download = set([k for i in items for k in i.assets])
         for key in download:
-            items.download(key=key, path=config.DATADIR, 
filename=config.FILENAME, requestor_pays=requestor_pays)
+            items.download(key=key, filename_template=filename_template, 
requester_pays=requester_pays)
 
     return items
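
The intersects handling added to cli.py above reduces a GeoJSON Feature or FeatureCollection to a bare geometry before searching; as a standalone sketch of that logic:

```python
def normalize_intersects(data):
    # Mirrors the 0.3.0 cli.py logic: accept a Feature, a
    # FeatureCollection (only the first feature is used), or
    # pass a raw geometry through unchanged.
    if data['type'] == 'Feature':
        return data['geometry']
    if data['type'] == 'FeatureCollection':
        return data['features'][0]['geometry']
    return data
```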
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/satsearch/config.py 
new/sat-search-0.3.0/satsearch/config.py
--- old/sat-search-0.2.3/satsearch/config.py    2020-06-25 17:33:58.000000000 
+0200
+++ new/sat-search-0.3.0/satsearch/config.py    1970-01-01 01:00:00.000000000 
+0100
@@ -1,10 +0,0 @@
-import os
-
-# API URL
-API_URL = os.getenv('SATUTILS_API_URL', 
'https://earth-search-legacy.aws.element84.com')
-
-# data directory to store downloaded imagery
-DATADIR = os.getenv('SATUTILS_DATADIR', '${collection}/${date}')
-
-# filename pattern for saving files
-FILENAME = os.getenv('SATUTILS_FILENAME', '${id}')
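
The new `--headers` option (parsed in cli.py above) accepts either a path to a JSON file or an inline JSON string, with all values coerced to strings before being handed to requests; a minimal sketch under that reading:

```python
import json
import os

def parse_headers(arg):
    # If the argument names an existing file, read JSON from it;
    # otherwise treat the argument itself as a JSON string.
    if os.path.exists(arg):
        with open(arg) as f:
            headers = json.load(f)
    else:
        headers = json.loads(arg)
    # Header values must be strings for the requests library.
    return {k: str(v) for k, v in headers.items()}
```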
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/sat-search-0.2.3/satsearch/search.py 
new/sat-search-0.3.0/satsearch/search.py
--- old/sat-search-0.2.3/satsearch/search.py    2020-06-24 18:57:40.000000000 
+0200
+++ new/sat-search-0.3.0/satsearch/search.py    2020-08-21 23:29:00.000000000 
+0200
@@ -3,13 +3,10 @@
 import logging
 import requests
 
-import satsearch.config as config
-
 from satstac import Collection, Item, ItemCollection
 from satstac.utils import dict_merge
 from urllib.parse import urljoin
 
-
 logger = logging.getLogger(__name__)
 
 
@@ -22,127 +19,111 @@
     search_op_list = ['>=', '<=', '=', '>', '<']
     search_op_to_stac_op = {'>=': 'gte', '<=': 'lte', '=': 'eq', '>': 'gt', 
'<': 'lt'}
 
-    def __init__(self, **kwargs):
+    def __init__(self, url=os.getenv('STAC_API_URL', None), **kwargs):
         """ Initialize a Search object with parameters """
+        if url is None:
+            raise SatSearchError("URL not provided, pass into Search or define 
STAC_API_URL environment variable")
+        self.url = url.rstrip("/") + "/"
         self.kwargs = kwargs
-        for k in self.kwargs:
-            if k == 'datetime':
-                self.kwargs['time'] = self.kwargs['datetime']
-                del self.kwargs['datetime']
+        self.limit = int(self.kwargs['limit']) if 'limit' in self.kwargs else 
None
 
     @classmethod
-    def search(cls, **kwargs):
-        if 'collection' in kwargs:
-            q = 'collection=%s' % kwargs['collection']
-            if 'property' not in kwargs:
-                kwargs['property'] = []
-            kwargs['property'].append(q)
-            del kwargs['collection']
-        if 'property' in kwargs and isinstance(kwargs['property'], list):
+    def search(cls, headers=None, **kwargs):
+        if 'query' in kwargs and isinstance(kwargs['query'], list):
             queries = {}
-            for prop in kwargs['property']:
+            for q in kwargs['query']:
                 for s in Search.search_op_list:
-                    parts = prop.split(s)
+                    parts = q.split(s)
                     if len(parts) == 2:
                         queries = dict_merge(queries, {parts[0]: 
{Search.search_op_to_stac_op[s]: parts[1]}})
                         break
-            del kwargs['property']
             kwargs['query'] = queries
-        directions = {'>': 'desc', '<': 'asc'}
-        if 'sort' in kwargs and isinstance(kwargs['sort'], list):
+        directions = {'-': 'desc', '+': 'asc'}
+        if 'sortby' in kwargs and isinstance(kwargs['sortby'], list):
             sorts = []
-            for a in kwargs['sort']:
+            for a in kwargs['sortby']:
                 if a[0] not in directions:
-                    a = '>' + a
+                    a = '+' + a
                 sorts.append({
                     'field': a[1:],
                     'direction': directions[a[0]]
                 })
-            del kwargs['sort']
-            kwargs['sort'] = sorts
+            kwargs['sortby'] = sorts
         return Search(**kwargs)
 
-    def found(self):
+    def found(self, headers=None):
         """ Small query to determine total number of hits """
-        if 'ids' in self.kwargs:
-            cid = self.kwargs['query']['collection']['eq']
-            return len(self.items_by_id(self.kwargs['ids'], cid))
         kwargs = {
-            'page': 1,
             'limit': 0
         }
         kwargs.update(self.kwargs)
-        results = self.query(**kwargs)
-        return results['meta']['found']
+        url = urljoin(self.url, 'search')
+        
+        results = self.query(url=url, headers=headers, **kwargs)
+        # TODO - check for status_code
+        logger.debug(f"Found: {json.dumps(results)}")
+        found = 0
+        if 'context' in results:
+            found = results['context']['matched']
+        elif 'numberMatched' in results:
+            found = results['numberMatched']
+        return found
 
-    @classmethod
-    def query(cls, url=urljoin(config.API_URL, 'stac/search'), **kwargs):
+    def query(self, url=None, headers=None, **kwargs):
         """ Get request """
+        url = url or urljoin(self.url, 'search')
         logger.debug('Query URL: %s, Body: %s' % (url, json.dumps(kwargs)))
-        response = requests.post(url, data=json.dumps(kwargs))
+        response = requests.post(url, json=kwargs, headers=headers)
+        logger.debug(f"Response: {response.text}")
         # API error
         if response.status_code != 200:
             raise SatSearchError(response.text)
         return response.json()
 
-    @classmethod
-    def collection(cls, cid):
+    def collection(self, cid, headers=None):
         """ Get a Collection record """
-        url = urljoin(config.API_URL, 'collections/%s' % cid)
-        return Collection(cls.query(url=url))
+        url = urljoin(self.url, 'collections/%s' % cid)
+        return Collection(self.query(url=url, headers=headers))
 
-    @classmethod
-    def items_by_id(cls, ids, collection):
-        """ Return Items from collection with matching ids """
-        col = cls.collection(collection)
-        items = []
-        base_url = urljoin(config.API_URL, 'collections/%s/items/' % collection)
-        for id in ids:
-            try:
-                items.append(Item(cls.query(urljoin(base_url, id))))
-            except SatSearchError as err:
-                pass
-        return ItemCollection(items, collections=[col])
-
-    def items(self, limit=10000):
+    def items(self, limit=10000, page_limit=500, headers=None):
         """ Return all of the Items and Collections for this search """
-        _limit = 500
-        if 'ids' in self.kwargs:
-            col = self.kwargs.get('query', {}).get('collection', {}).get('eq', None)
-            if col is None:
-                raise SatSearchError('Collection required when searching by id')
-            return self.items_by_id(self.kwargs['ids'], col)
-
-        items = []
-        found = self.found()
+        found = self.found(headers=headers)
+        limit = self.limit or limit
         if found > limit:
             logger.warning('There are more items found (%s) than the limit (%s) provided.' % (found, limit))
-        maxitems = min(found, limit)
-        kwargs = {
-            'page': 1,
-            'limit': min(_limit, maxitems)
+
+        nextlink = {
+            'method': 'POST',
+            'href': urljoin(self.url, 'search'),
+            'headers': headers,
+            'body': self.kwargs,
+            'merge': False
         }
-        kwargs.update(self.kwargs)
-        while len(items) < maxitems:
-            items += [Item(i) for i in self.query(**kwargs)['features']]
-            kwargs['page'] += 1
+
+        items = []
+        while nextlink and len(items) < limit:
+            if nextlink.get('method', 'GET') == 'GET':
+                resp = self.query(url=nextlink['href'], headers=headers, **self.kwargs)
+            else:
+                _headers = nextlink.get('headers', {})
+                _body = nextlink.get('body', {})
+                _body.update({'limit': page_limit})
+                
+                if nextlink.get('merge', False):
+                    _headers.update(headers)
+                    _body.update(self.kwargs)
+                resp = self.query(url=nextlink['href'], headers=_headers, **_body)
+            items += [Item(i) for i in resp['features']]
+            links = [l for l in resp['links'] if l['rel'] == 'next']
+            nextlink = links[0] if len(links) == 1 else None
 
         # retrieve collections
         collections = []
-        for c in set([item.properties['collection'] for item in items if 'collection' in item.properties]):
-            collections.append(self.collection(c))
-            #del collections[c]['links']
-
-        # merge collections into items
-        #_items = []
-        #for item in items:
-        #    import pdb; pdb.set_trace()
-        #    if 'collection' in item['properties']:
-        #        item = dict_merge(item, collections[item['properties']['collection']])
-        #    _items.append(Item(item))
-
-        search = {
-            'endpoint': config.API_URL,
-            'parameters': self.kwargs
-        }
-        return ItemCollection(items, collections=collections, search=search)
+        try:
+            for c in set([item._data['collection'] for item in items if 'collection' in item._data]):
+                collections.append(self.collection(c, headers=headers))
+                #del collections[c]['links']
+        except:
+            pass
+        logger.debug(f"Found: {len(items)}")
+        return ItemCollection(items, collections=collections)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sat-search-0.2.3/satsearch/version.py new/sat-search-0.3.0/satsearch/version.py
--- old/sat-search-0.2.3/satsearch/version.py   2020-06-25 17:36:43.000000000 +0200
+++ new/sat-search-0.3.0/satsearch/version.py   2020-08-21 23:29:00.000000000 +0200
@@ -1 +1 @@
-__version__ = '0.2.3'
+__version__ = '0.3.0'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sat-search-0.2.3/test/test_cli.py new/sat-search-0.3.0/test/test_cli.py
--- old/sat-search-0.2.3/test/test_cli.py       2020-06-24 18:57:40.000000000 +0200
+++ new/sat-search-0.3.0/test/test_cli.py       2020-08-21 23:29:00.000000000 +0200
@@ -4,22 +4,16 @@
 from unittest.mock import patch
 import json
 import shutil
-import satsearch.config as config
 
 from satsearch.cli import main, SatUtilsParser, cli
 
 
 testpath = os.path.dirname(__file__)
-config.DATADIR = testpath
 
 
 class Test(unittest.TestCase):
     """ Test main module """
 
-    num_scenes = 740
-
-    args = 'search --datetime 2017-01-01 -p eo:cloud_cover=0/20 eo:platform=landsat-8'
-
     @classmethod
     def get_test_parser(cls):
         """ Get testing parser with search and load subcommands """
@@ -37,12 +31,12 @@
         parser = self.get_test_parser()
         args = parser.parse_args(['search'])
         self.assertEqual(len(args), 3)
-        self.assertFalse(args['printcal'])
+        self.assertFalse(args['found'])
 
     def test_parse_args(self):
         """ Parse arguments """
         parser = self.get_test_parser()
-        args = self.args.split(' ')
+        args = 'search --datetime 2017-01-01 -q eo:cloud_cover<10 platform=sentinel-2a'.split(' ')
         
         args = parser.parse_args(args)
         self.assertEqual(len(args), 5)
@@ -56,17 +50,17 @@
     def _test_parse_args_badcloud(self):
         parser = self.get_test_parser()
         with self.assertRaises(ValueError):
-            args = parser.parse_args('search --datetime 2017-01-01 --cloud 0.5 eo:platform Landsat-8'.split(' '))
+            args = parser.parse_args('search --datetime 2017-01-01 -q platform=sentinel-2a'.split(' '))
 
     def test_main(self):
         """ Run main function """
-        items = main(datetime='2019-07-01', **{'collection': 'landsat-8-l1'})
-        self.assertEqual(len(items), self.num_scenes)
+        items = main(datetime='2020-01-01', collections=['sentinel-s2-l1c'], query=['eo:cloud_cover=0', 'data_coverage>80'])
+        self.assertEqual(len(items), 207)
 
     def test_main_found(self):
         """ Run main function """
-        found = main(datetime='2019-07-01', found=True, **{'collection': 'landsat-8-l1'})
-        self.assertEqual(found, self.num_scenes)
+        found = main(datetime='2020-01-01', found=True)
+        self.assertEqual(found, 17819)
 
     def test_main_load(self):
         items = main(items=os.path.join(testpath, 'scenes.geojson'))
@@ -75,33 +69,32 @@
     def test_main_options(self):
         """ Test main program with output options """
         fname = os.path.join(testpath, 'test_main-save.json')
-        items = main(datetime='2019-07-01', save=fname, printcal=True, printmd=[], property=['eo:platform=landsat-8'])
-        self.assertEqual(len(items), self.num_scenes)
+        items = main(datetime='2020-01-01', save=fname, printcal=True, printmd=[],
+                     collections=['sentinel-s2-l2a'], query=['eo:cloud_cover=0', 'data_coverage>80'])
+        self.assertEqual(len(items), 212)
         self.assertTrue(os.path.exists(fname))
         os.remove(fname)
         self.assertFalse(os.path.exists(fname))
 
     def test_cli(self):
         """ Run CLI program """
-        with patch.object(sys, 'argv', 'sat-search search --datetime 2017-01-01 --found -p eo:platform=landsat-8'.split(' ')):
+        with patch.object(sys, 'argv', 'sat-search search --datetime 2017-01-01 --found -q platform=sentinel-2b'.split(' ')):
             cli()
 
     def test_cli_intersects(self):
-        cmd = 'sat-search search --intersects %s -p eo:platform=landsat-8 --found' % os.path.join(testpath, 'aoi1.geojson')
+        cmd = 'sat-search search --intersects %s -q platform=sentinel-2b --found' % os.path.join(testpath, 'aoi1.geojson')
         with patch.object(sys, 'argv', cmd.split(' ')):
             cli()        
 
     def test_main_download(self):
         """ Test main program with downloading """
         with open(os.path.join(testpath, 'aoi1.geojson')) as f:
-            aoi = json.dumps(json.load(f))
-        config.DATADIR = os.path.join(testpath, "${eo:platform}")
-        items = main(datetime='2019-06-05/2019-06-21', intersects=aoi, download=['thumbnail', 'MTL'], **{'collection': 'landsat-8-l1'})
+            aoi = json.load(f)
+        filename_template = os.path.join(testpath, "test-download/${platform}/${id}")
+        items = main(datetime='2020-06-07', intersects=aoi['geometry'],
+                     filename_template=filename_template, download=['thumbnail', 'info'], **{'collections': ['sentinel-s2-l1c']})
         for item in items:
-            bname = os.path.splitext(item.get_filename(config.DATADIR))[0]
+            bname = os.path.splitext(item.get_path(filename_template))[0]
             assert(os.path.exists(bname + '_thumbnail.jpg'))
-            if not os.path.exists(bname + '_MTL.txt'):
-                import pdb; pdb.set_trace()
-            assert(os.path.exists(bname + '_MTL.txt'))
-        shutil.rmtree(os.path.join(testpath,'landsat-8'))
-        config.DATADIR = testpath
+            assert(os.path.exists(bname + '_info.json'))
+        #shutil.rmtree(os.path.join(testpath,'landsat-8'))
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sat-search-0.2.3/test/test_search.py new/sat-search-0.3.0/test/test_search.py
--- old/sat-search-0.2.3/test/test_search.py    2020-06-24 18:57:40.000000000 +0200
+++ new/sat-search-0.3.0/test/test_search.py    2020-08-21 23:29:00.000000000 +0200
@@ -3,11 +3,11 @@
 import json
 import unittest
 
-import satsearch.config as config
-
 from satstac import Item
 from satsearch.search import SatSearchError, Search
 
+API_URL = 'https://earth-search.aws.element84.com/v0'
+
 
 class Test(unittest.TestCase):
 
@@ -23,7 +23,7 @@
 
     def get_searches(self):
         """ Initialize and return search object """
-        return [Search(datetime=r['properties']['datetime']) for r in self.results]
+        return [Search(datetime=r['properties']['datetime'], url=API_URL) for r in self.results]
 
     def test_search_init(self):
         """ Initialize a search object """
@@ -31,11 +31,11 @@
         dts = [r['properties']['datetime'] for r in self.results]
         
         assert(len(search.kwargs) == 1)
-        assert('time' in search.kwargs)
+        assert('datetime' in search.kwargs)
         for kw in search.kwargs:
             self.assertTrue(search.kwargs[kw] in dts)
 
-    def test_search_for_items_by_date(self):
+    def _test_search_for_items_by_date(self):
         """ Search for specific item """
         search = self.get_searches()[0]
         sids = [r['id'] for r in self.results]
@@ -52,47 +52,31 @@
     def test_geo_search(self):
         """ Perform simple query """
         with open(os.path.join(self.path, 'aoi1.geojson')) as f:
-            aoi = json.dumps(json.load(f))
-        search = Search(datetime='2019-07-01', intersects=aoi)
-        assert(search.found() == 13)
+            aoi = json.load(f)
+        search = Search(datetime='2020-06-07', intersects=aoi['geometry'])
+        assert(search.found() == 12)
         items = search.items()
-        assert(len(items) == 13)
+        assert(len(items) == 12)
         assert(isinstance(items[0], Item))
 
     def test_search_sort(self):
         """ Perform search with sort """
         with open(os.path.join(self.path, 'aoi1.geojson')) as f:
-            aoi = json.dumps(json.load(f))
-        search = Search.search(datetime='2019-07-01/2019-07-07', intersects=aoi, sort=['<datetime'])
+            aoi = json.load(f)
+        search = Search.search(datetime='2020-06-07', intersects=aoi['geometry'], sortby=['-properties.datetime'])
         items = search.items()
-        assert(len(items) == 27)
-
-    def test_get_items_by_id(self):
-        """ Get Items by ID """
-        ids = ['LC81692212019263', 'LC81691102019263']
-        items = Search.items_by_id(ids, collection='landsat-8-l1')
-        assert(len(items) == 2)
+        assert(len(items) == 12)
 
     def test_get_ids_search(self):
         """ Get Items by ID through normal search """
-        ids = ['LC81692212019263', 'LC81691102019263']
-        search = Search.search(ids=ids, collection='landsat-8-l1')
+        ids = ['S2A_28QBH_20200611_0_L2A', 'S2A_28QCH_20200611_0_L2A']
+        search = Search.search(ids=ids)
         items = search.items()
-        assert(search.found() == 2)
-        assert(len(items) == 2)
+        assert(search.found() == 4)
+        assert(len(items) == 4)
 
-    def test_get_ids_without_collection(self):
-        with self.assertRaises(SatSearchError):
-            search = Search.search(ids=['LC80340332018034LGN00'])
-            items = search.items()
-
-    def test_query_bad_url(self):
-        with self.assertRaises(SatSearchError):
-            Search.query(url=os.path.join(config.API_URL, 'collections/nosuchcollection'))
-
-    def test_search_property_operator(self):
-        expected = {'query': {'eo:cloud_cover': {'lte': '10'}, 'collection': {'eq': 'sentinel-2-l1c'}}}
-        instance = Search.search(collection='sentinel-2-l1c',
-                                 property=['eo:cloud_cover<=10'])
-        actual = instance.kwargs
-        assert actual == expected
+    def test_search_query_operator(self):
+        expected = {'collections': ['sentinel-s2-l1c'], 'query': {'eo:cloud_cover': {'lte': '10'}, 'data_coverage': {'gt': '80'}}}
+        instance = Search.search(collections=['sentinel-s2-l1c'],
+                                 query=['eo:cloud_cover<=10', 'data_coverage>80'])
+        assert instance.kwargs == expected
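For readers updating scripts to 0.3.0: the patch above replaces `property=`/`sort=` with `query=`/`sortby=`, and the sort-direction prefixes change from `>`/`<` to `-`/`+`. The following is a standalone sketch of that parsing logic, reconstructed from the diff for illustration only — the operator table and helper names here are assumptions, not sat-search's actual internals.

```python
# STAC operator names for the CLI-style comparison symbols used by `-q`.
# Two-character operators must be tried before their one-character prefixes,
# otherwise 'eo:cloud_cover<=10' would split on '<' first.
STAC_OPS = {'>=': 'gte', '<=': 'lte', '=': 'eq', '>': 'gt', '<': 'lt'}


def parse_query(terms):
    """Turn terms like 'eo:cloud_cover<=10' into a STAC query dict."""
    query = {}
    for term in terms:
        for op in ('>=', '<=', '=', '>', '<'):
            parts = term.split(op)
            if len(parts) == 2:
                query.setdefault(parts[0], {})[STAC_OPS[op]] = parts[1]
                break
    return query


def parse_sortby(fields):
    """'-field' sorts descending, '+field' (or a bare field) ascending."""
    directions = {'-': 'desc', '+': 'asc'}
    sorts = []
    for f in fields:
        if f[0] not in directions:
            f = '+' + f
        sorts.append({'field': f[1:], 'direction': directions[f[0]]})
    return sorts
```

With the inputs used in the new tests, `parse_query(['eo:cloud_cover<=10', 'data_coverage>80'])` yields the same `query` dict that `test_search_query_operator` expects.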
