Hello community,

here is the log from the commit of package python-elasticsearch for openSUSE:Factory checked in at 2017-10-20 14:47:05
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-elasticsearch (Old)
 and      /work/SRC/openSUSE:Factory/.python-elasticsearch.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-elasticsearch"

Fri Oct 20 14:47:05 2017 rev:2 rq:535242 version:5.4.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-elasticsearch/python-elasticsearch.changes       2017-03-02 19:25:37.323085924 +0100
+++ /work/SRC/openSUSE:Factory/.python-elasticsearch.new/python-elasticsearch.changes   2017-10-20 14:47:06.384775915 +0200
@@ -1,0 +2,7 @@
+Thu Oct 19 00:43:55 UTC 2017 - [email protected]
+
+- Implement single-spec version
+- Update to version 5.4.0
+  * see changelog at https://github.com/elastic/elasticsearch-py/blob/5.4.0/Changelog.rst
+
+-------------------------------------------------------------------

Old:
----
  elasticsearch-5.2.0.tar.gz

New:
----
  elasticsearch-5.4.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-elasticsearch.spec ++++++
--- /var/tmp/diff_new_pack.neRdh1/_old  2017-10-20 14:47:07.552721299 +0200
+++ /var/tmp/diff_new_pack.neRdh1/_new  2017-10-20 14:47:07.552721299 +0200
@@ -16,38 +16,56 @@
 #
 
 
+%{?!python_module:%define python_module() python-%{**} python3-%{**}}
+# Test files not included in source archive
+%bcond_with     test
 Name:           python-elasticsearch
-Version:        5.2.0
+Version:        5.4.0
 Release:        0
 Summary:        Python client for Elasticsearch
 License:        Apache-2.0
 Group:          Development/Languages/Python
-Url:            https://github.com/elasticsearch/elasticsearch-py
-Source:         https://pypi.io/packages/source/e/elasticsearch/elasticsearch-%{version}.tar.gz
-BuildRequires:  python-devel
-BuildRequires:  python-setuptools
-Requires:       python-urllib3
-BuildRoot:      %{_tmppath}/%{name}-%{version}-build
-%if 0%{?suse_version} && 0%{?suse_version} <= 1110
-%{!?python_sitelib: %global python_sitelib %(python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()")}
-%else
-BuildArch:      noarch
+Url:            https://github.com/elastic/elasticsearch-py
+Source:         https://files.pythonhosted.org/packages/source/e/elasticsearch/elasticsearch-%{version}.tar.gz
+BuildRequires:  %{python_module devel}
+BuildRequires:  %{python_module setuptools}
+BuildRequires:  fdupes
+BuildRequires:  python-rpm-macros
+%if %{with test}
+BuildRequires:  %{python_module coverage}
+BuildRequires:  %{python_module mock}
+BuildRequires:  %{python_module nosexcover}
+BuildRequires:  %{python_module nose}
+BuildRequires:  %{python_module pyaml}
+BuildRequires:  %{python_module requests >= 2.0.0}
+BuildRequires:  %{python_module urllib3 >= 1.8}
 %endif
+Requires:       python-urllib3 >= 1.8
+BuildArch:      noarch
+
+%python_subpackages
 
 %description
-Official low-level client for Elasticsearch. It provides common
-code for all further Elasticsearch-related code in Python.
+Official low-level client for Elasticsearch. Its goal is to provide common
+ground for all Elasticsearch-related code in Python; because of this it tries
+to be opinion-free and very extendable.
 
 %prep
 %setup -q -n elasticsearch-%{version}
 
 %build
-python setup.py build
+%python_build
 
 %install
-python setup.py install --prefix=%{_prefix} --root=%{buildroot}
+%python_install
+%python_expand %fdupes %{buildroot}%{$python_sitelib}
+
+%if %{with test}
+%check
+%python_exec setup.py test
+%endif
 
-%files
+%files %{python_files}
 %defattr(-,root,root,-)
 %doc AUTHORS Changelog.rst LICENSE README README.rst
 %{python_sitelib}/*

++++++ elasticsearch-5.2.0.tar.gz -> elasticsearch-5.4.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/Changelog.rst 
new/elasticsearch-5.4.0/Changelog.rst
--- old/elasticsearch-5.2.0/Changelog.rst       2017-02-12 16:24:44.000000000 +0100
+++ new/elasticsearch-5.4.0/Changelog.rst       2017-05-18 19:10:24.000000000 +0200
@@ -3,6 +3,17 @@
 Changelog
 =========
 
+5.4.0 (2017-05-18)
+------------------
+
+ * ``bulk`` helpers now extract ``pipeline`` parameter from the action
+   dictionary.
+
+5.3.0 (2017-03-30)
+------------------
+
+Compatibility with elasticsearch 5.3
+
 5.2.0 (2017-02-12)
 ------------------
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/PKG-INFO 
new/elasticsearch-5.4.0/PKG-INFO
--- old/elasticsearch-5.2.0/PKG-INFO    2017-02-12 16:33:55.000000000 +0100
+++ new/elasticsearch-5.4.0/PKG-INFO    2017-05-18 19:12:04.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: elasticsearch
-Version: 5.2.0
+Version: 5.4.0
 Summary: Python client for Elasticsearch
 Home-page: https://github.com/elastic/elasticsearch-py
 Author: Honza Král
@@ -74,6 +74,26 @@
             pip install elasticsearch
         
         
+        Run Elasticsearch in a Container
+        --------------------------------
+        
+        To run Elasticsearch in a container, optionally set the `ES_VERSION` environment variable to 5.4, 5.3 or 2.4; `ES_VERSION` defaults to `latest`.
+        Then run ./start_elasticsearch.sh::
+        
+            export ES_VERSION=5.4
+            ./start_elasticsearch.sh
+        
+        
+        This will run a version of Elasticsearch in a Docker container suitable for running the tests. To check that Elasticsearch is running,
+        first wait for a `healthy` status in `docker ps`::
+        
+            $ docker ps
+            CONTAINER ID        IMAGE                      COMMAND                  CREATED             STATUS                   PORTS                              NAMES
+            955e57564e53        7d2ad83f8446               "/docker-entrypoin..."   6 minutes ago       Up 6 minutes (healthy)   0.0.0.0:9200->9200/tcp, 9300/tcp   trusting_brattain
+        
+        Then you can navigate to `localhost:9200` in your browser.
+        
+        
         Example use
         -----------
         
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/README 
new/elasticsearch-5.4.0/README
--- old/elasticsearch-5.2.0/README      2017-01-03 15:55:15.000000000 +0100
+++ new/elasticsearch-5.4.0/README      2017-05-18 19:09:54.000000000 +0200
@@ -66,6 +66,26 @@
     pip install elasticsearch
 
 
+Run Elasticsearch in a Container
+--------------------------------
+
+To run Elasticsearch in a container, optionally set the `ES_VERSION` environment variable to 5.4, 5.3 or 2.4; `ES_VERSION` defaults to `latest`.
+Then run ./start_elasticsearch.sh::
+
+    export ES_VERSION=5.4
+    ./start_elasticsearch.sh
+
+
+This will run a version of Elasticsearch in a Docker container suitable for running the tests. To check that Elasticsearch is running,
+first wait for a `healthy` status in `docker ps`::
+
+    $ docker ps
+    CONTAINER ID        IMAGE                      COMMAND                  CREATED             STATUS                   PORTS                              NAMES
+    955e57564e53        7d2ad83f8446               "/docker-entrypoin..."   6 minutes ago       Up 6 minutes (healthy)   0.0.0.0:9200->9200/tcp, 9300/tcp   trusting_brattain
+
+Then you can navigate to `localhost:9200` in your browser.
+
+
 Example use
 -----------
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/README.rst 
new/elasticsearch-5.4.0/README.rst
--- old/elasticsearch-5.2.0/README.rst  2017-01-03 15:55:15.000000000 +0100
+++ new/elasticsearch-5.4.0/README.rst  2017-05-18 19:09:54.000000000 +0200
@@ -66,6 +66,26 @@
     pip install elasticsearch
 
 
+Run Elasticsearch in a Container
+--------------------------------
+
+To run Elasticsearch in a container, optionally set the `ES_VERSION` environment variable to 5.4, 5.3 or 2.4; `ES_VERSION` defaults to `latest`.
+Then run ./start_elasticsearch.sh::
+
+    export ES_VERSION=5.4
+    ./start_elasticsearch.sh
+
+
+This will run a version of Elasticsearch in a Docker container suitable for running the tests. To check that Elasticsearch is running,
+first wait for a `healthy` status in `docker ps`::
+
+    $ docker ps
+    CONTAINER ID        IMAGE                      COMMAND                  CREATED             STATUS                   PORTS                              NAMES
+    955e57564e53        7d2ad83f8446               "/docker-entrypoin..."   6 minutes ago       Up 6 minutes (healthy)   0.0.0.0:9200->9200/tcp, 9300/tcp   trusting_brattain
+
+Then you can navigate to `localhost:9200` in your browser.
+
+
 Example use
 -----------
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/docs/Changelog.rst 
new/elasticsearch-5.4.0/docs/Changelog.rst
--- old/elasticsearch-5.2.0/docs/Changelog.rst  2017-02-12 16:24:44.000000000 +0100
+++ new/elasticsearch-5.4.0/docs/Changelog.rst  2017-05-18 19:10:24.000000000 +0200
@@ -3,6 +3,17 @@
 Changelog
 =========
 
+5.4.0 (2017-05-18)
+------------------
+
+ * ``bulk`` helpers now extract ``pipeline`` parameter from the action
+   dictionary.
+
+5.3.0 (2017-03-30)
+------------------
+
+Compatibility with elasticsearch 5.3
+
 5.2.0 (2017-02-12)
 ------------------
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/docs/helpers.rst 
new/elasticsearch-5.4.0/docs/helpers.rst
--- old/elasticsearch-5.2.0/docs/helpers.rst    2015-10-13 22:21:10.000000000 +0200
+++ new/elasticsearch-5.4.0/docs/helpers.rst    2017-05-04 11:22:21.000000000 +0200
@@ -29,7 +29,7 @@
         '_type': 'document',
         '_id': 42,
         '_parent': 5,
-        '_ttl': '1d',
+        'pipeline': 'my-ingest-pipeline',
         '_source': {
             "title": "Hello World!",
             "body": "..."
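The docs change above reflects the 5.4.0 behaviour noted in the changelog: the ``bulk`` helpers now pop ``pipeline`` out of the action dictionary and pass it along as a request parameter instead of treating it as document metadata. A minimal standalone sketch of that extraction (``expand_action_sketch`` is a hypothetical name for illustration, not the library's internal code):

```python
# Sketch: how a bulk helper might lift 'pipeline' out of an action dict.
def expand_action_sketch(action):
    action = dict(action)                # don't mutate the caller's dict
    source = action.pop('_source', None) # document body travels separately
    params = {}
    if 'pipeline' in action:
        # 'pipeline' is not document metadata; it becomes a query parameter
        params['pipeline'] = action.pop('pipeline')
    return action, source, params

op, src, params = expand_action_sketch({
    '_op_type': 'index',
    '_index': 'my-index',
    '_type': 'document',
    '_id': 42,
    'pipeline': 'my-ingest-pipeline',
    '_source': {'title': 'Hello World!'},
})
print(params)  # -> {'pipeline': 'my-ingest-pipeline'}
```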
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/__init__.py 
new/elasticsearch-5.4.0/elasticsearch/__init__.py
--- old/elasticsearch-5.2.0/elasticsearch/__init__.py   2017-02-12 16:24:59.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/__init__.py   2017-05-18 19:10:37.000000000 +0200
@@ -1,6 +1,6 @@
 from __future__ import absolute_import
 
-VERSION = (5, 2, 0)
+VERSION = (5, 4, 0)
 __version__ = VERSION
 __versionstr__ = '.'.join(map(str, VERSION))
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/client/__init__.py 
new/elasticsearch-5.4.0/elasticsearch/client/__init__.py
--- old/elasticsearch-5.2.0/elasticsearch/client/__init__.py    2017-02-12 16:23:49.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/client/__init__.py    2017-05-15 22:04:19.000000000 +0200
@@ -3,7 +3,7 @@
 
 from ..transport import Transport
 from ..exceptions import TransportError
-from ..compat import string_types, urlparse
+from ..compat import string_types, urlparse, unquote
 from .indices import IndicesClient
 from .ingest import IngestClient
 from .cluster import ClusterClient
@@ -44,12 +44,10 @@
             if parsed_url.scheme == "https":
                 h['port'] = parsed_url.port or 443
                 h['use_ssl'] = True
-                h['scheme'] = 'http'
-            elif parsed_url.scheme:
-                h['scheme'] = parsed_url.scheme
 
             if parsed_url.username or parsed_url.password:
-                h['http_auth'] = '%s:%s' % (parsed_url.username, parsed_url.password)
+                h['http_auth'] = '%s:%s' % (unquote(parsed_url.username),
+                                            unquote(parsed_url.password))
 
             if parsed_url.path and parsed_url.path != '/':
                 h['url_prefix'] = parsed_url.path
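The ``unquote`` change in the hunk above matters when credentials embedded in a node URL are percent-encoded; without it, a password containing ``@`` or ``:`` would be passed through still encoded. A small standalone illustration using only the standard library:

```python
from urllib.parse import urlparse, unquote

# Percent-encoded password: %40 is '@', %3A is ':'.
url = urlparse('https://user:p%40ss%3Aword@localhost:9200/prefix')

# Old behaviour would have produced 'user:p%40ss%3Aword'; with unquote:
http_auth = '%s:%s' % (unquote(url.username), unquote(url.password))
print(http_auth)  # -> user:p@ss:word
```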
@@ -221,8 +219,8 @@
         """
         return self.transport.perform_request('GET', '/', params=params)
 
-    @query_params('parent', 'pipeline', 'refresh', 'routing',
-        'timeout', 'timestamp', 'ttl', 'version', 'version_type')
+    @query_params('parent', 'pipeline', 'refresh', 'routing', 'timeout',
+        'timestamp', 'ttl', 'version', 'version_type', 'wait_for_active_shards')
     def create(self, index, doc_type, id, body, params=None):
         """
         Adds a typed JSON document in a specific index, making it searchable.
@@ -299,7 +297,9 @@
         return self.transport.perform_request('POST' if id in SKIP_IN_PATH else 'PUT',
             _make_path(index, doc_type, id), params=params, body=body)
 
-    @query_params('parent', 'preference', 'realtime', 'refresh', 'routing')
+    @query_params('_source', '_source_exclude', '_source_include', 'parent',
+        'preference', 'realtime', 'refresh', 'routing', 'stored_fields',
+        'version', 'version_type')
     def exists(self, index, doc_type, id, params=None):
         """
         Returns a boolean indicating whether or not given document exists in Elasticsearch.
@@ -309,6 +309,12 @@
         :arg doc_type: The type of the document (use `_all` to fetch the first
             document matching the ID across all types)
         :arg id: The document ID
+        :arg _source: True or false to return the _source field or not, or a
+            list of fields to return
+        :arg _source_exclude: A list of fields to exclude from the returned
+            _source field
+        :arg _source_include: A list of fields to extract and return from the
+            _source field
         :arg parent: The ID of the parent document
         :arg preference: Specify the node or shard the operation should be
             performed on (default: random)
@@ -317,6 +323,11 @@
         :arg refresh: Refresh the shard containing the document before
             performing the operation
         :arg routing: Specific routing value
+        :arg stored_fields: A comma-separated list of stored fields to return in
+            the response
+        :arg version: Explicit version number for concurrency control
+        :arg version_type: Specific version type, valid choices are: 'internal',
+            'external', 'external_gte', 'force'
         """
         for param in (index, doc_type, id):
             if param in SKIP_IN_PATH:
@@ -325,6 +336,41 @@
             doc_type, id), params=params)
 
     @query_params('_source', '_source_exclude', '_source_include', 'parent',
+        'preference', 'realtime', 'refresh', 'routing', 'version',
+        'version_type')
+    def exists_source(self, index, doc_type, id, params=None):
+        """
+        `<http://www.elastic.co/guide/en/elasticsearch/reference/master/docs-get.html>`_
+
+        :arg index: The name of the index
+        :arg doc_type: The type of the document; use `_all` to fetch the first
+            document matching the ID across all types
+        :arg id: The document ID
+        :arg _source: True or false to return the _source field or not, or a
+            list of fields to return
+        :arg _source_exclude: A list of fields to exclude from the returned
+            _source field
+        :arg _source_include: A list of fields to extract and return from the
+            _source field
+        :arg parent: The ID of the parent document
+        :arg preference: Specify the node or shard the operation should be
+            performed on (default: random)
+        :arg realtime: Specify whether to perform the operation in realtime or
+            search mode
+        :arg refresh: Refresh the shard containing the document before
+            performing the operation
+        :arg routing: Specific routing value
+        :arg version: Explicit version number for concurrency control
+        :arg version_type: Specific version type, valid choices are: 'internal',
+            'external', 'external_gte', 'force'
+        """
+        for param in (index, doc_type, id):
+            if param in SKIP_IN_PATH:
+                raise ValueError("Empty value passed for a required argument.")
+        return self.transport.perform_request('HEAD', _make_path(index,
+            doc_type, id, '_source'), params=params)
+
+    @query_params('_source', '_source_exclude', '_source_include', 'parent',
         'preference', 'realtime', 'refresh', 'routing', 'stored_fields',
         'version', 'version_type')
     def get(self, index, id, doc_type='_all', params=None):
@@ -449,7 +495,7 @@
         :arg _source_include: A list of fields to extract and return from the
             _source field
         :arg fields: A comma-separated list of fields to return in the response
-        :arg lang: The script language (default: groovy)
+        :arg lang: The script language (default: painless)
         :arg parent: ID of the parent document. It is only used for routing and
             for the upsert request
         :arg refresh: If `true` then refresh the affected shards to make this
@@ -479,13 +525,14 @@
             doc_type, id, '_update'), params=params, body=body)
 
     @query_params('_source', '_source_exclude', '_source_include',
-        'allow_no_indices', 'analyze_wildcard', 'analyzer', 'default_operator',
-        'df', 'docvalue_fields', 'expand_wildcards', 'explain',
-        'fielddata_fields', 'from_', 'ignore_unavailable', 'lenient',
-        'lowercase_expanded_terms', 'preference', 'q', 'request_cache',
-        'routing', 'scroll', 'search_type', 'size', 'sort', 'stats',
-        'stored_fields', 'suggest_field', 'suggest_mode', 'suggest_size',
+        'suggest_text', 'terminate_after', 'timeout', 'track_scores', 'version')
+        'allow_no_indices', 'analyze_wildcard', 'analyzer',
+        'batched_reduce_size', 'default_operator', 'df', 'docvalue_fields',
+        'expand_wildcards', 'explain', 'fielddata_fields', 'from_',
+        'ignore_unavailable', 'lenient', 'lowercase_expanded_terms',
+        'preference', 'q', 'request_cache', 'routing', 'scroll',
+        'search_type', 'size', 'sort', 'stats', 'stored_fields',
+        'suggest_field', 'suggest_mode', 'suggest_size', 'suggest_text',
+        'terminate_after', 'timeout', 'track_scores', 'typed_keys', 'version')
     def search(self, index=None, doc_type=None, body=None, params=None):
         """
         Execute a search query and get back search hits that match the query.
@@ -508,6 +555,11 @@
         :arg analyze_wildcard: Specify whether wildcard and prefix queries
             should be analyzed (default: false)
         :arg analyzer: The analyzer to use for the query string
+        :arg batched_reduce_size: The number of shard results that should be
+            reduced at once on the coordinating node. This value should be used
+            as a protection mechanism to reduce the memory overhead per search
+            request if the potential number of shards in the request can be
+            large., default 512
         :arg default_operator: The default operator for query string query (AND
             or OR), default 'OR', valid choices are: 'AND', 'OR'
         :arg df: The field to use as default where no field prefix is given in
@@ -556,6 +608,8 @@
         :arg timeout: Explicit operation timeout
         :arg track_scores: Whether to calculate and return scores even if they
             are not used for sorting
+        :arg typed_keys: Specify whether aggregation and suggester names should
+            be prefixed by their respective types in the response
         :arg version: Specify whether to return document version as part of a
             hit
         """
@@ -643,6 +697,9 @@
         :arg search_type: Search operation type, valid choices are:
             'query_then_fetch', 'dfs_query_then_fetch'
         :arg size: Number of hits to return (default: 10)
+        :arg slices: The number of slices this task should be divided into.
+            Defaults to 1 meaning the task isn't sliced into subtasks., default
+            1
         :arg sort: A comma-separated list of <field>:<direction> pairs
         :arg stats: Specific 'tag' of the request for logging and statistical
             purposes
@@ -672,14 +729,14 @@
             equal to the total number of copies for the shard (number of
             replicas + 1)
         :arg wait_for_completion: Should the request block until the
-            reindex is complete., default False
+            reindex is complete., default True
         """
         if index in SKIP_IN_PATH:
             raise ValueError("Empty value passed for a required argument 'index'.")
         return self.transport.perform_request('POST', _make_path(index,
             doc_type, '_update_by_query'), params=params, body=body)
 
-    @query_params('refresh', 'requests_per_second', 'timeout',
+    @query_params('refresh', 'requests_per_second', 'slices', 'timeout',
         'wait_for_active_shards', 'wait_for_completion')
     def reindex(self, body, params=None):
         """
@@ -692,6 +749,9 @@
         :arg requests_per_second: The throttle to set on this request in sub-
             requests per second. -1 means set no throttle as does "unlimited"
             which is the only non-float this accepts., default 0
+        :arg slices: The number of slices this task should be divided into.
+            Defaults to 1 meaning the task isn't sliced into subtasks., default
+            1
         :arg timeout: Time each individual bulk request should wait for shards
             that are unavailable., default '1m'
         :arg wait_for_active_shards: Sets the number of shard copies that must
@@ -700,7 +760,7 @@
             copies, otherwise set to any non-negative value less than or equal
             to the total number of copies for the shard (number of replicas + 1)
         :arg wait_for_completion: Should the request block until the
-            reindex is complete., default False
+            reindex is complete., default True
         """
         if body in SKIP_IN_PATH:
             raise ValueError("Empty value passed for a required argument 
'body'.")
@@ -725,7 +785,7 @@
         'default_operator', 'df', 'docvalue_fields', 'expand_wildcards',
         'explain', 'from_', 'ignore_unavailable', 'lenient',
         'lowercase_expanded_terms', 'preference', 'q', 'refresh',
-        'request_cache', 'requests_per_second', 'routing', 'scroll',
+        'request_cache', 'requests_per_second', 'routing', 'scroll', 'slices',
         'scroll_size', 'search_timeout', 'search_type', 'size', 'sort', 'stats',
         'stored_fields', 'suggest_field', 'suggest_mode', 'suggest_size',
         'suggest_text', 'terminate_after', 'timeout', 'track_scores', 'version',
@@ -779,7 +839,7 @@
         :arg request_cache: Specify if request cache should be used for this
             request or not, defaults to index level setting
         :arg requests_per_second: The throttle for this request in sub-requests
-            per second. -1 means set no throttle., default 0
+            per second. -1 means no throttle., default 0
         :arg routing: A comma-separated list of specific routing values
         :arg scroll: Specify how long a consistent view of the index should be
             maintained for scrolled search
@@ -790,6 +850,9 @@
         :arg search_type: Search operation type, valid choices are:
             'query_then_fetch', 'dfs_query_then_fetch'
         :arg size: Number of hits to return (default: 10)
+        :arg slices: The number of slices this task should be divided into.
+            Defaults to 1 meaning the task isn't sliced into subtasks., default
+            1
         :arg sort: A comma-separated list of <field>:<direction> pairs
         :arg stats: Specific 'tag' of the request for logging and statistical
             purposes
@@ -817,7 +880,7 @@
             equal to the total number of copies for the shard (number of
             replicas + 1)
         :arg wait_for_completion: Should the request block until the
-            delete-by-query is complete., default False
+            delete-by-query is complete., default True
         """
         for param in (index, body):
             if param in SKIP_IN_PATH:
@@ -855,8 +918,9 @@
         return self.transport.perform_request('GET', _make_path(index,
             doc_type, '_search_shards'), params=params)
 
-    @query_params('allow_no_indices', 'expand_wildcards', 'ignore_unavailable',
-        'preference', 'routing', 'scroll', 'search_type')
+    @query_params('allow_no_indices', 'expand_wildcards', 'explain',
+        'ignore_unavailable', 'preference', 'profile', 'routing', 'scroll',
+        'search_type', 'typed_keys')
     def search_template(self, index=None, doc_type=None, body=None, params=None):
         """
         A query that accepts a query template and a map of key/value pairs to
@@ -874,16 +938,21 @@
         :arg expand_wildcards: Whether to expand wildcard expression to concrete
             indices that are open, closed or both., default 'open', valid
             choices are: 'open', 'closed', 'none', 'all'
+        :arg explain: Specify whether to return detailed information about score
+            computation as part of a hit
         :arg ignore_unavailable: Whether specified concrete indices should be
             ignored when unavailable (missing or closed)
         :arg preference: Specify the node or shard the operation should be
             performed on (default: random)
+        :arg profile: Specify whether to profile the query execution
         :arg routing: A comma-separated list of specific routing values
         :arg scroll: Specify how long a consistent view of the index should be
             maintained for scrolled search
         :arg search_type: Search operation type, valid choices are:
             'query_then_fetch', 'query_and_fetch', 'dfs_query_then_fetch',
             'dfs_query_and_fetch'
+        :arg typed_keys: Specify whether aggregation and suggester names should
+            be prefixed by their respective types in the response
         """
         return self.transport.perform_request('GET', _make_path(index,
             doc_type, '_search', 'template'), params=params, body=body)
@@ -1095,7 +1164,7 @@
         return self.transport.perform_request('POST', _make_path(index,
             doc_type, '_bulk'), params=params, body=self._bulk_body(body))
 
-    @query_params('max_concurrent_searches', 'search_type')
+    @query_params('max_concurrent_searches', 'search_type', 'typed_keys')
     def msearch(self, body, index=None, doc_type=None, params=None):
         """
         Execute several search requests within the same API.
@@ -1111,6 +1180,8 @@
         :arg search_type: Search operation type, valid choices are:
             'query_then_fetch', 'query_and_fetch', 'dfs_query_then_fetch',
             'dfs_query_and_fetch'
+        :arg typed_keys: Specify whether aggregation and suggester names should
+            be prefixed by their respective types in the response
         """
         if body in SKIP_IN_PATH:
             raise ValueError("Empty value passed for a required argument 'body'.")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/client/nodes.py 
new/elasticsearch-5.4.0/elasticsearch/client/nodes.py
--- old/elasticsearch-5.2.0/elasticsearch/client/nodes.py       2017-02-08 20:55:57.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/client/nodes.py       2017-05-10 16:38:38.000000000 +0200
@@ -57,7 +57,7 @@
         return self.transport.perform_request('GET', _make_path('_nodes',
             node_id, 'stats', metric, index_metric), params=params)
 
-    @query_params('doc_type', 'ignore_idle_threads', 'interval', 'snapshots',
+    @query_params('type', 'ignore_idle_threads', 'interval', 'snapshots',
         'threads', 'timeout')
     def hot_threads(self, node_id=None, params=None):
         """
@@ -68,7 +68,7 @@
             returned information; use `_local` to return information from the
             node you're connecting to, leave empty to get information from all
             nodes
-        :arg doc_type: The type to sample (default: cpu), valid choices are:
+        :arg type: The type to sample (default: cpu), valid choices are:
             'cpu', 'wait', 'block'
         :arg ignore_idle_threads: Don't show threads that are in known-idle
             places, such as waiting on a socket select or pulling from an empty
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/compat.py 
new/elasticsearch-5.4.0/elasticsearch/compat.py
--- old/elasticsearch-5.2.0/elasticsearch/compat.py     2015-08-24 22:02:18.000000000 +0200
+++ new/elasticsearch-5.4.0/elasticsearch/compat.py     2017-05-04 11:22:21.000000000 +0200
@@ -4,10 +4,10 @@
 
 if PY2:
     string_types = basestring,
-    from urllib import quote_plus, urlencode
+    from urllib import quote_plus, urlencode, unquote
     from urlparse import  urlparse
     from itertools import imap as map
 else:
     string_types = str, bytes
-    from urllib.parse import quote_plus, urlencode, urlparse
+    from urllib.parse import quote_plus, urlencode, urlparse, unquote
     map = map
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/connection/base.py 
new/elasticsearch-5.4.0/elasticsearch/connection/base.py
--- old/elasticsearch-5.2.0/elasticsearch/connection/base.py    2017-01-03 15:55:15.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/connection/base.py    2017-05-15 22:04:19.000000000 +0200
@@ -24,8 +24,6 @@
 
     Also responsible for logging.
     """
-    transport_schema = 'http'
-
     def __init__(self, host='localhost', port=9200, use_ssl=False, url_prefix='', timeout=10, **kwargs):
         """
         :arg host: hostname of the node (default: localhost)
@@ -33,9 +31,12 @@
         :arg url_prefix: optional url prefix for elasticsearch
         :arg timeout: default timeout in seconds (float, default: 10)
         """
-        scheme = self.transport_schema
-        if use_ssl:
-            scheme += 's'
+        scheme = kwargs.get('scheme', 'http')
+        if use_ssl or scheme == 'https':
+            scheme = 'https'
+            use_ssl = True
+        self.use_ssl = use_ssl
+
         self.host = '%s://%s:%s' % (scheme, host, port)
         if url_prefix:
             url_prefix = '/' + url_prefix.strip('/')
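The rewritten constructor above replaces the old class-level ``transport_schema`` with a reconciliation of the ``scheme`` keyword and the ``use_ssl`` flag: either one forces HTTPS. A minimal sketch of that decision as a free function (the library does this inline in ``Connection.__init__``):

```python
# Sketch of the scheme/use_ssl reconciliation in the new __init__.
def resolve_scheme(use_ssl=False, scheme='http'):
    # An explicit https scheme or use_ssl=True both mean TLS,
    # and the two flags are kept consistent with each other.
    if use_ssl or scheme == 'https':
        return 'https', True
    return scheme, use_ssl

print(resolve_scheme(scheme='https'))  # -> ('https', True)
print(resolve_scheme(use_ssl=True))    # -> ('https', True)
print(resolve_scheme())                # -> ('http', False)
```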
@@ -61,7 +62,9 @@
         path = path.replace('?', '?pretty&', 1) if '?' in path else path + '?pretty'
         if self.url_prefix:
             path = path.replace(self.url_prefix, '', 1)
-        tracer.info("curl -X%s 'http://localhost:9200%s' -d '%s'", method, path, self._pretty_json(body) if body else '')
+        tracer.info("curl %s-X%s 'http://localhost:9200%s' -d '%s'",
+                    "-H 'Content-Type: application/json' " if body else '',
+                    method, path, self._pretty_json(body) if body else '')
 
         if tracer.isEnabledFor(logging.DEBUG):
             tracer.debug('#[%s] (%.3fs)\n#%s', status_code, duration, self._pretty_json(response).replace('\n', '\n#') if response else '')
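The trace-logging change above only prepends a ``Content-Type`` header hint to the reproducible curl line when a request body is present. The formatting logic in isolation (this sketch skips the library's ``_pretty_json`` step and just interpolates the raw body):

```python
# Sketch of the new curl trace line: header hint only when there is a body.
def curl_trace(method, path, body=None):
    return "curl %s-X%s 'http://localhost:9200%s' -d '%s'" % (
        "-H 'Content-Type: application/json' " if body else '',
        method, path, body or '')

print(curl_trace('GET', '/_search?pretty'))
print(curl_trace('POST', '/_search?pretty', '{"query": {}}'))
```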
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/elasticsearch-5.2.0/elasticsearch/connection/http_requests.py 
new/elasticsearch-5.4.0/elasticsearch/connection/http_requests.py
--- old/elasticsearch-5.2.0/elasticsearch/connection/http_requests.py   2017-02-08 20:19:47.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/connection/http_requests.py   2017-05-15 22:04:19.000000000 +0200
@@ -32,7 +32,7 @@
         if not REQUESTS_AVAILABLE:
             raise ImproperlyConfigured("Please install requests to use RequestsHttpConnection.")
 
-    super(RequestsHttpConnection, self).__init__(host=host, port=port, **kwargs)
+    super(RequestsHttpConnection, self).__init__(host=host, port=port, use_ssl=use_ssl, **kwargs)
         self.session = requests.Session()
         self.session.headers = headers or {}
         self.session.headers.setdefault('content-type', 'application/json')
@@ -43,7 +43,7 @@
                 http_auth = tuple(http_auth.split(':', 1))
             self.session.auth = http_auth
         self.base_url = 'http%s://%s:%d%s' % (
-            's' if use_ssl else '',
+            's' if self.use_ssl else '',
             host, port, self.url_prefix
         )
         self.session.verify = verify_certs
@@ -57,7 +57,7 @@
                 raise ImproperlyConfigured("You cannot pass CA certificates when verify SSL is off.")
             self.session.verify = ca_certs
 
-        if use_ssl and not verify_certs:
+        if self.use_ssl and not verify_certs:
             warnings.warn(
                 'Connecting to %s using SSL with verify_certs=False is insecure.' % self.base_url)
 
@@ -67,12 +67,13 @@
             url = '%s?%s' % (url, urlencode(params or {}))
 
         start = time.time()
+        request = requests.Request(method=method, url=url, data=body)
+        prepared_request = self.session.prepare_request(request)
+        settings = self.session.merge_environment_settings(prepared_request.url, {}, None, None, None)
+        send_kwargs = {'timeout': timeout or self.timeout}
+        send_kwargs.update(settings)
         try:
-            request = requests.Request(method=method, url=url, data=body)
-            prepared_request = self.session.prepare_request(request)
-            response = self.session.send(
-                prepared_request,
-                timeout=timeout or self.timeout)
+            response = self.session.send(prepared_request, **send_kwargs)
             duration = time.time() - start
             raw_data = response.text
         except Exception as e:
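The reworked `perform_request` above moves request construction out of the `try` block and honors environment-derived settings (proxies, `REQUESTS_CA_BUNDLE`, etc.) via `merge_environment_settings`. A standalone sketch of that pattern, assuming the `requests` package is installed; nothing is sent over the network here:

```python
# Standalone sketch of the prepare/merge pattern used in the hunk above,
# assuming `requests` is installed; no request is actually sent.
import requests

session = requests.Session()
request = requests.Request(method='GET', url='http://localhost:9200/_search')
prepared_request = session.prepare_request(request)

# merge_environment_settings folds in environment-derived proxies, CA bundle
# (REQUESTS_CA_BUNDLE), streaming and cert settings for this URL.
settings = session.merge_environment_settings(prepared_request.url, {}, None, None, None)
send_kwargs = {'timeout': 10}
send_kwargs.update(settings)
print(sorted(send_kwargs))
```

The resulting `send_kwargs` would then be passed to `session.send(prepared_request, **send_kwargs)` exactly as the new code does.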
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/connection_pool.py new/elasticsearch-5.4.0/elasticsearch/connection_pool.py
--- old/elasticsearch-5.2.0/elasticsearch/connection_pool.py    2016-05-16 15:49:21.000000000 +0200
+++ new/elasticsearch-5.4.0/elasticsearch/connection_pool.py    2017-05-04 11:22:21.000000000 +0200
@@ -1,6 +1,7 @@
 import time
 import random
 import logging
+import threading
 
 try:
     from Queue import PriorityQueue, Empty
@@ -58,12 +59,12 @@
     """
     def __init__(self, opts):
         super(RoundRobinSelector, self).__init__(opts)
-        self.rr = -1
+        self.data = threading.local()
 
     def select(self, connections):
-        self.rr += 1
-        self.rr %= len(connections)
-        return connections[self.rr]
+        self.data.rr = getattr(self.data, 'rr', -1) + 1
+        self.data.rr %= len(connections)
+        return connections[self.data.rr]
 
 class ConnectionPool(object):
     """
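The change above makes `RoundRobinSelector` thread-safe by giving each thread its own rotation counter via `threading.local()`, instead of a shared, unguarded `self.rr`. A self-contained sketch of the same technique:

```python
# Sketch of the thread-local round-robin selection from the hunk above;
# each thread keeps its own rotation counter rather than sharing one index.
import threading

class RoundRobinSelector:
    def __init__(self):
        self.data = threading.local()

    def select(self, connections):
        # getattr with a default initializes the counter lazily per thread
        self.data.rr = getattr(self.data, 'rr', -1) + 1
        self.data.rr %= len(connections)
        return connections[self.data.rr]

selector = RoundRobinSelector()
conns = ['node1', 'node2', 'node3']
print([selector.select(conns) for _ in range(4)])  # ['node1', 'node2', 'node3', 'node1']
```

Each thread starts its own rotation at the first connection, so no lock is needed around the counter update.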
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/helpers/__init__.py new/elasticsearch-5.4.0/elasticsearch/helpers/__init__.py
--- old/elasticsearch-5.2.0/elasticsearch/helpers/__init__.py   2017-01-03 15:55:15.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/helpers/__init__.py   2017-05-07 15:01:09.000000000 +0200
@@ -35,7 +35,8 @@
     op_type = data.pop('_op_type', 'index')
     action = {op_type: {}}
     for key in ('_index', '_parent', '_percolate', '_routing', '_timestamp',
-            '_ttl', '_type', '_version', '_version_type', '_id', '_retry_on_conflict'):
+                '_type', '_version', '_version_type', '_id',
+                '_retry_on_conflict', 'pipeline'):
         if key in data:
             action[op_type][key] = data.pop(key)
 
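The hunk above drops the removed `_ttl` metadata key and adds `pipeline` to the fields that the bulk helper lifts from each document into its action line. A hedged sketch of that expansion step (the function name `expand_action_sketch` is illustrative, not the library's public API):

```python
# Illustrative sketch of the bulk action expansion modified above; the name
# `expand_action_sketch` is hypothetical, not the library's actual function.
def expand_action_sketch(data):
    data = dict(data)
    op_type = data.pop('_op_type', 'index')
    action = {op_type: {}}
    # metadata keys move from the document into the bulk action line
    for key in ('_index', '_parent', '_percolate', '_routing', '_timestamp',
                '_type', '_version', '_version_type', '_id',
                '_retry_on_conflict', 'pipeline'):
        if key in data:
            action[op_type][key] = data.pop(key)
    return action, data

action, source = expand_action_sketch(
    {'_id': 1, '_index': 'logs', 'pipeline': 'geoip', 'message': 'hello'})
print(action)  # {'index': {'_index': 'logs', '_id': 1, 'pipeline': 'geoip'}}
print(source)  # {'message': 'hello'}
```

Everything not recognized as metadata stays in the document body, which becomes the source line of the bulk request.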
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch/transport.py new/elasticsearch-5.4.0/elasticsearch/transport.py
--- old/elasticsearch-5.2.0/elasticsearch/transport.py  2017-01-09 14:35:01.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch/transport.py  2017-05-15 22:04:19.000000000 +0200
@@ -151,12 +151,6 @@
             # previously unseen params, create new connection
             kwargs = self.kwargs.copy()
             kwargs.update(host)
-
-            if 'scheme' in host and host['scheme'] != self.connection_class.transport_schema:
-                raise ImproperlyConfigured(
-                    'Scheme specified in connection (%s) is not the same as the connection class (%s) specifies (%s).' % (
-                        host['scheme'], self.connection_class.__name__, self.connection_class.transport_schema
-                ))
             return self.connection_class(**kwargs)
         connections = map(_create_connection, hosts)
 
@@ -242,8 +236,8 @@
 
         hosts = list(filter(None, (self._get_host_info(n) for n in node_info)))
 
-        # we weren't able to get any nodes, maybe using an incompatible
-        # transport_schema or host_info_callback blocked all - raise error.
+        # we weren't able to get any nodes or host_info_callback blocked all -
+        # raise error.
         if not hosts:
             raise TransportError("N/A", "Unable to sniff hosts - no viable hosts found.")
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch.egg-info/PKG-INFO new/elasticsearch-5.4.0/elasticsearch.egg-info/PKG-INFO
--- old/elasticsearch-5.2.0/elasticsearch.egg-info/PKG-INFO     2017-02-12 16:33:55.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch.egg-info/PKG-INFO     2017-05-18 19:12:04.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: elasticsearch
-Version: 5.2.0
+Version: 5.4.0
 Summary: Python client for Elasticsearch
 Home-page: https://github.com/elastic/elasticsearch-py
 Author: Honza Král
@@ -74,6 +74,26 @@
             pip install elasticsearch
         
         
+        Run Elasticsearch in a Container
+        --------------------------------
+        
+        To run elasticsearch in a container, optionally set the `ES_VERSION` environment variable to either 5.4, 5.3 or 2.4. `ES_VERSION` defaults to `latest`.
+        Then run ./start_elasticsearch.sh::
+        
+            export ES_VERSION=5.4
+            ./start_elasticsearch.sh
+        
+        
+        This will run a version of Elasticsearch in a Docker container suitable for running the tests. To check that elasticsearch is running, first wait for a `healthy` status in `docker ps`::
+        
+            $ docker ps
+            CONTAINER ID        IMAGE                      COMMAND                  CREATED             STATUS                   PORTS                              NAMES
+            955e57564e53        7d2ad83f8446               "/docker-entrypoin..."   6 minutes ago       Up 6 minutes (healthy)   0.0.0.0:9200->9200/tcp, 9300/tcp   trusting_brattain
+        
+        Then you can navigate to `localhost:9200` in your browser.
+        
+        
         Example use
         -----------
         
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch.egg-info/SOURCES.txt new/elasticsearch-5.4.0/elasticsearch.egg-info/SOURCES.txt
--- old/elasticsearch-5.2.0/elasticsearch.egg-info/SOURCES.txt  2017-02-12 16:33:55.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch.egg-info/SOURCES.txt  2017-05-18 19:12:04.000000000 +0200
@@ -26,7 +26,6 @@
 elasticsearch.egg-info/PKG-INFO
 elasticsearch.egg-info/SOURCES.txt
 elasticsearch.egg-info/dependency_links.txt
-elasticsearch.egg-info/pbr.json
 elasticsearch.egg-info/requires.txt
 elasticsearch.egg-info/top_level.txt
 elasticsearch/client/__init__.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch.egg-info/pbr.json new/elasticsearch-5.4.0/elasticsearch.egg-info/pbr.json
--- old/elasticsearch-5.2.0/elasticsearch.egg-info/pbr.json     2016-11-04 15:17:37.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch.egg-info/pbr.json     1970-01-01 01:00:00.000000000 +0100
@@ -1 +0,0 @@
-{"is_release": false, "git_version": "29ce4bd"}
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/elasticsearch.egg-info/requires.txt new/elasticsearch-5.4.0/elasticsearch.egg-info/requires.txt
--- old/elasticsearch-5.2.0/elasticsearch.egg-info/requires.txt 2017-02-12 16:33:55.000000000 +0100
+++ new/elasticsearch-5.4.0/elasticsearch.egg-info/requires.txt 2017-05-18 19:12:04.000000000 +0200
@@ -1 +1,11 @@
 urllib3>=1.8, <2.0
+
+[develop]
+requests>=2.0.0, <3.0.0
+nose
+coverage
+mock
+pyaml
+nosexcover
+sphinx
+sphinx_rtd_theme
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/elasticsearch-5.2.0/setup.py new/elasticsearch-5.4.0/setup.py
--- old/elasticsearch-5.2.0/setup.py    2017-02-12 16:24:54.000000000 +0100
+++ new/elasticsearch-5.4.0/setup.py    2017-05-18 19:10:33.000000000 +0200
@@ -3,7 +3,7 @@
 from setuptools import setup, find_packages
 import sys
 
-VERSION = (5, 2, 0)
+VERSION = (5, 4, 0)
 __version__ = VERSION
 __versionstr__ = '.'.join(map(str, VERSION))
 
@@ -15,7 +15,7 @@
     'urllib3>=1.8, <2.0',
 ]
 tests_require = [
-    'requests>=1.0.0, <3.0.0',
+    'requests>=2.0.0, <3.0.0',
     'nose',
     'coverage',
     'mock',
@@ -60,4 +60,8 @@
 
     test_suite='test_elasticsearch.run_tests.run_all',
     tests_require=tests_require,
+
+    extras_require={
+        'develop': tests_require + ["sphinx", "sphinx_rtd_theme"]
+    },
 )

