Hello community,

here is the log from the commit of package python-s3fs for openSUSE:Factory checked in at 2022-02-01 14:02:46
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-s3fs (Old)
 and      /work/SRC/openSUSE:Factory/.python-s3fs.new.1898 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-s3fs"

Tue Feb  1 14:02:46 2022 rev:11 rq:949696 version:2022.1.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-s3fs/python-s3fs.changes  2022-01-20 00:12:55.082607430 +0100
+++ /work/SRC/openSUSE:Factory/.python-s3fs.new.1898/python-s3fs.changes        2022-02-01 14:03:12.383970176 +0100
@@ -1,0 +2,9 @@
+Mon Jan 24 17:27:18 UTC 2022 - Ben Greiner <c...@bnavigator.de>
+
+- Update to 2022.1.0
+  * aiobotocore dep to 2.1.0 (#564)
+  * docs for non-aws (#567)
+  * ContentType in info (#570)
+  * small-file ACL (#574)
+
+-------------------------------------------------------------------

Old:
----
  s3fs-2021.11.1.tar.gz

New:
----
  s3fs-2022.1.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-s3fs.spec ++++++
--- /var/tmp/diff_new_pack.R0pibN/_old  2022-02-01 14:03:12.915966529 +0100
+++ /var/tmp/diff_new_pack.R0pibN/_new  2022-02-01 14:03:12.927966446 +0100
@@ -1,7 +1,7 @@
 #
 # spec file for package python-s3fs
 #
-# Copyright (c) 2021 SUSE LLC
+# Copyright (c) 2022 SUSE LLC
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -19,14 +19,14 @@
 %{?!python_module:%define python_module() python3-%{**}}
 %define skip_python2 1
 Name:           python-s3fs
-Version:        2021.11.1
+Version:        2022.1.0
 Release:        0
 Summary:        Python filesystem interface for S3
 License:        BSD-3-Clause
 URL:            https://github.com/fsspec/s3fs/
 Source:         https://files.pythonhosted.org/packages/source/s/s3fs/s3fs-%{version}.tar.gz
 BuildRequires:  %{python_module Flask}
-BuildRequires:  %{python_module aiobotocore >= 2.0.1}
+BuildRequires:  %{python_module aiobotocore >= 2.1.0}
 BuildRequires:  %{python_module aiohttp}
 BuildRequires:  %{python_module boto3}
 BuildRequires:  %{python_module fsspec = %{version}}
@@ -39,7 +39,7 @@
 %if %{with python2}
 BuildRequires:  python-mock
 %endif
-Requires:       python-aiobotocore >= 2.0.1
+Requires:       python-aiobotocore >= 2.1.0
 Requires:       python-aiohttp
 Requires:       python-fsspec = %{version}
 Recommends:     aws-cli

++++++ s3fs-2021.11.1.tar.gz -> s3fs-2022.1.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/PKG-INFO new/s3fs-2022.1.0/PKG-INFO
--- old/s3fs-2021.11.1/PKG-INFO 2021-11-26 23:17:20.484670200 +0100
+++ new/s3fs-2022.1.0/PKG-INFO  2022-01-11 20:52:39.153018700 +0100
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: s3fs
-Version: 2021.11.1
+Version: 2022.1.0
 Summary: Convenient Filesystem interface over S3
 Home-page: http://github.com/fsspec/s3fs/
 Maintainer: Martin Durant
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/docs/source/changelog.rst new/s3fs-2022.1.0/docs/source/changelog.rst
--- old/s3fs-2021.11.1/docs/source/changelog.rst        2021-11-26 23:16:02.000000000 +0100
+++ new/s3fs-2022.1.0/docs/source/changelog.rst 2022-01-11 20:51:51.000000000 +0100
@@ -1,6 +1,14 @@
 Changelog
 =========
 
+2022.01.0
+---------
+
+- aiobotocore dep to 2.1.0 (#564)
+- docs for non-aws (#567)
+- ContentType in info (#570)
+- small-file ACL (#574)
+
 2021.11.1
 ---------
 
@@ -20,7 +28,7 @@
 2021.10.1
 ---------
 
-- alow other methods than GET to url/sign (#536)
+- allow other methods than GET to url/sign (#536)
 
 2021.10.0
 ---------
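
A side note on the "ContentType in info (#570)" entry above: from this release on, the dict returned by S3FileSystem.info() for a file carries the object's Content-Type through from the underlying head_object call. A minimal sketch of how that surfaces to callers; the bucket, key, and MIME type are invented for illustration:

    import s3fs

    # Hypothetical bucket/key; any object readable with your credentials behaves the same.
    fs = s3fs.S3FileSystem(anon=False)
    meta = fs.info("my-bucket/reports/summary.json")

    # Since 2022.01.0 the info dict includes "ContentType" next to
    # existing fields such as "Size", "StorageClass" and "VersionId".
    print(meta.get("ContentType"))  # e.g. "application/json"
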
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/docs/source/index.rst new/s3fs-2022.1.0/docs/source/index.rst
--- old/s3fs-2021.11.1/docs/source/index.rst    2021-04-20 15:13:11.000000000 +0200
+++ new/s3fs-2022.1.0/docs/source/index.rst     2021-12-28 21:02:35.000000000 +0100
@@ -153,8 +153,8 @@
 
 - for nodes on EC2, the IAM metadata provider
 
-You can specifiy a profile using `s3fs.S3FileSystem(profile='PROFILE')`.
-Othwerwise ``sf3s`` will use authentication via `boto environment variables`_.
+You can specify a profile using `s3fs.S3FileSystem(profile='PROFILE')`.
+Otherwise ``sf3s`` will use authentication via `boto environment variables`_.
 
 .. _boto environment variables: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables
 
@@ -172,21 +172,60 @@
 instance, so this method could be used in preference to the constructor in
 cases where the code must be agnostic of the credentials/config used.
 
-Self-hosted S3
---------------
+S3 Compatible Storage
+---------------------
 
-To use ``s3fs`` against your self hosted S3-compatible storage, like `MinIO`_ or
-`Ceph Object Gateway`_, you can set your custom ``endpoint_url`` when creating
-the ``s3fs`` filesystem:
+To use ``s3fs`` against an S3 compatible storage, like `MinIO`_ or
+`Ceph Object Gateway`_, you'll probably need to pass extra parameters when
+creating the ``s3fs`` filesystem. Here are some sample configurations:
 
-.. code-block:: python
+For a self-hosted MinIO instance:
 
+.. code-block:: python
+   # When relying on auto discovery for credentials
    >>> s3 = s3fs.S3FileSystem(
-         anon=false,
+         anon=False,
          client_kwargs={
             'endpoint_url': 'https://...'
          }
       )
+   # Or passing the credentials directly
+   >>> s3 = s3fs.S3FileSystem(
+         key='miniokey...',
+         secret='asecretkey...',
+         client_kwargs={
+            'endpoint_url': 'https://...'
+         }
+      )
+
+For a Scaleway s3-compatible storage in the ``fr-par`` zone:
+
+.. code-block:: python
+
+   >>> s3 = s3fs.S3FileSystem(
+      key='scaleway-api-key...',
+      secret='scaleway-secretkey...',
+      client_kwargs={
+         'endpoint_url': 'https://s3.fr-par.scw.cloud',
+         'region_name': 'fr-par'
+      }
+   )
+
+For an OVH s3-compatible storage in the ``GRA`` zone:
+
+.. code-block:: python
+
+   >>> s3 = s3fs.S3FileSystem(
+      key='ovh-s3-key...',
+      secret='ovh-s3-secretkey...',
+      client_kwargs={
+         'endpoint_url': 'https://s3.GRA.cloud.ovh.net',
+         'region_name': 'GRA'
+      },
+      config_kwargs={
+         'signature_version': 's3v4'
+      }
+   )
 
 
 .. _MinIO: https://min.io
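
The configuration samples added above stop at constructing the filesystem object. For completeness, a hedged usage sketch of such an instance; the endpoint URL, credentials, bucket, and key below are placeholders, not values taken from the diff:

    import s3fs

    # Placeholder endpoint and keys; substitute your own MinIO/Ceph/Scaleway/OVH values.
    fs = s3fs.S3FileSystem(
        key="access-key...",
        secret="secret-key...",
        client_kwargs={"endpoint_url": "https://s3.example.com"},
    )

    # The fsspec-style API is the same regardless of the backing endpoint.
    with fs.open("my-bucket/hello.txt", "wb") as f:
        f.write(b"hello from s3fs")
    print(fs.ls("my-bucket"))
    print(fs.cat("my-bucket/hello.txt"))  # b'hello from s3fs'
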
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/requirements.txt new/s3fs-2022.1.0/requirements.txt
--- old/s3fs-2021.11.1/requirements.txt 2021-11-26 23:16:02.000000000 +0100
+++ new/s3fs-2022.1.0/requirements.txt  2022-01-11 20:51:51.000000000 +0100
@@ -1,3 +1,3 @@
-aiobotocore~=2.0.1
-fsspec==2021.11.1
+aiobotocore~=2.1.0
+fsspec==2022.01.0
 aiohttp<=4
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/s3fs/_version.py new/s3fs-2022.1.0/s3fs/_version.py
--- old/s3fs-2021.11.1/s3fs/_version.py 2021-11-26 23:17:20.485778000 +0100
+++ new/s3fs-2022.1.0/s3fs/_version.py  2022-01-11 20:52:39.154271000 +0100
@@ -8,11 +8,11 @@
 
 version_json = '''
 {
- "date": "2021-11-26T17:15:57-0500",
+ "date": "2022-01-11T14:51:45-0500",
  "dirty": false,
  "error": null,
- "full-revisionid": "a5991c68dd5a6939834af9294589b3878a039f99",
- "version": "2021.11.1"
+ "full-revisionid": "dfa1cd28467541cfafb25056a0402a49bacd219a",
+ "version": "2022.01.0"
 }
 '''  # END VERSION_JSON
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/s3fs/core.py new/s3fs-2022.1.0/s3fs/core.py
--- old/s3fs-2021.11.1/s3fs/core.py     2021-11-26 22:46:05.000000000 +0100
+++ new/s3fs-2022.1.0/s3fs/core.py      2022-01-05 19:07:23.000000000 +0100
@@ -1044,6 +1044,7 @@
                     "type": "file",
                     "StorageClass": "STANDARD",
                     "VersionId": out.get("VersionId"),
+                    "ContentType": out.get("ContentType"),
                 }
             except FileNotFoundError:
                 pass
@@ -1978,6 +1979,7 @@
                     Key=self.key,
                     Bucket=self.bucket,
                     Body=data,
+                    ACL=self.acl,
                     **self.kwargs,
                 )
             else:
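
The second hunk above is the "small-file ACL" fix (#574): uploads small enough to take the single-shot put_object path now forward the ACL that was requested when the file was opened, as multipart uploads already did. A brief sketch of the calling pattern, with an invented bucket and key:

    import s3fs

    fs = s3fs.S3FileSystem(anon=False)

    # A payload below the multipart threshold goes through the patched
    # put_object branch, so the acl= argument is no longer dropped.
    with fs.open("my-bucket/public/notice.txt", "wb", acl="public-read") as f:
        f.write(b"small payload")
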
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/s3fs/tests/test_s3fs.py new/s3fs-2022.1.0/s3fs/tests/test_s3fs.py
--- old/s3fs-2021.11.1/s3fs/tests/test_s3fs.py  2021-11-05 20:51:13.000000000 +0100
+++ new/s3fs-2022.1.0/s3fs/tests/test_s3fs.py   2022-01-05 19:07:23.000000000 +0100
@@ -256,6 +256,7 @@
     linfo = s3.ls(a, detail=True)[0]
     assert abs(info.pop("LastModified") - linfo.pop("LastModified")).seconds < 1
     info.pop("VersionId")
+    info.pop("ContentType")
     assert info == linfo
     parent = a.rsplit("/", 1)[0]
     s3.invalidate_cache()  # remove full path from the cache
@@ -663,7 +664,7 @@
     assert not s3.exists(test_bucket_name + "/nested/nested2/file1")
 
 
-@pytest.mark.xfail(reason="anon user is still priviliged on moto")
+@pytest.mark.xfail(reason="anon user is still privileged on moto")
 def test_anonymous_access(s3):
     with ignoring(NoCredentialsError):
         s3 = S3FileSystem(anon=True, client_kwargs={"endpoint_url": endpoint_uri})
@@ -1033,6 +1034,28 @@
     assert s3.info(test_bucket_name + "/test")["Size"] == 0
 
 
+def test_write_small_with_acl(s3):
+    bucket, key = (test_bucket_name, "test-acl")
+    filename = bucket + "/" + key
+    body = b"hello"
+    public_read_acl = {
+        "Permission": "READ",
+        "Grantee": {
+            "URI": "http://acs.amazonaws.com/groups/global/AllUsers",
+            "Type": "Group",
+        },
+    }
+
+    with s3.open(filename, "wb", acl="public-read") as f:
+        f.write(body)
+    assert s3.cat(filename) == body
+
+    assert (
+        public_read_acl
+        in sync(s3.loop, s3.s3.get_object_acl, Bucket=bucket, Key=key)["Grants"]
+    )
+
+
 def test_write_large(s3):
     "flush() chunks buffer when processing large singular payload"
     mb = 2 ** 20
@@ -2123,32 +2146,32 @@
 
 
 def test_same_name_but_no_exact(s3):
-    s3.touch(test_bucket_name + "/very/similiar/prefix1")
-    s3.touch(test_bucket_name + "/very/similiar/prefix2")
-    s3.touch(test_bucket_name + "/very/similiar/prefix3/something")
-    assert not s3.exists(test_bucket_name + "/very/similiar/prefix")
-    assert not s3.exists(test_bucket_name + "/very/similiar/prefi")
-    assert not s3.exists(test_bucket_name + "/very/similiar/pref")
-
-    assert s3.exists(test_bucket_name + "/very/similiar/")
-    assert s3.exists(test_bucket_name + "/very/similiar/prefix1")
-    assert s3.exists(test_bucket_name + "/very/similiar/prefix2")
-    assert s3.exists(test_bucket_name + "/very/similiar/prefix3")
-    assert s3.exists(test_bucket_name + "/very/similiar/prefix3/")
-    assert s3.exists(test_bucket_name + "/very/similiar/prefix3/something")
-
-    assert not s3.exists(test_bucket_name + "/very/similiar/prefix3/some")
-
-    s3.touch(test_bucket_name + "/starting/very/similiar/prefix")
-
-    assert not s3.exists(test_bucket_name + "/starting/very/similiar/prefix1")
-    assert not s3.exists(test_bucket_name + "/starting/very/similiar/prefix2")
-    assert not s3.exists(test_bucket_name + "/starting/very/similiar/prefix3")
-    assert not s3.exists(test_bucket_name + "/starting/very/similiar/prefix3/")
-    assert not s3.exists(test_bucket_name + "/starting/very/similiar/prefix3/something")
+    s3.touch(test_bucket_name + "/very/similar/prefix1")
+    s3.touch(test_bucket_name + "/very/similar/prefix2")
+    s3.touch(test_bucket_name + "/very/similar/prefix3/something")
+    assert not s3.exists(test_bucket_name + "/very/similar/prefix")
+    assert not s3.exists(test_bucket_name + "/very/similar/prefi")
+    assert not s3.exists(test_bucket_name + "/very/similar/pref")
+
+    assert s3.exists(test_bucket_name + "/very/similar/")
+    assert s3.exists(test_bucket_name + "/very/similar/prefix1")
+    assert s3.exists(test_bucket_name + "/very/similar/prefix2")
+    assert s3.exists(test_bucket_name + "/very/similar/prefix3")
+    assert s3.exists(test_bucket_name + "/very/similar/prefix3/")
+    assert s3.exists(test_bucket_name + "/very/similar/prefix3/something")
+
+    assert not s3.exists(test_bucket_name + "/very/similar/prefix3/some")
+
+    s3.touch(test_bucket_name + "/starting/very/similar/prefix")
+
+    assert not s3.exists(test_bucket_name + "/starting/very/similar/prefix1")
+    assert not s3.exists(test_bucket_name + "/starting/very/similar/prefix2")
+    assert not s3.exists(test_bucket_name + "/starting/very/similar/prefix3")
+    assert not s3.exists(test_bucket_name + "/starting/very/similar/prefix3/")
+    assert not s3.exists(test_bucket_name + "/starting/very/similar/prefix3/something")
 
-    assert s3.exists(test_bucket_name + "/starting/very/similiar/prefix")
-    assert s3.exists(test_bucket_name + "/starting/very/similiar/prefix/")
+    assert s3.exists(test_bucket_name + "/starting/very/similar/prefix")
+    assert s3.exists(test_bucket_name + "/starting/very/similar/prefix/")
 
 
 def test_leading_forward_slash(s3):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/s3fs.egg-info/PKG-INFO new/s3fs-2022.1.0/s3fs.egg-info/PKG-INFO
--- old/s3fs-2021.11.1/s3fs.egg-info/PKG-INFO   2021-11-26 23:17:20.000000000 +0100
+++ new/s3fs-2022.1.0/s3fs.egg-info/PKG-INFO    2022-01-11 20:52:38.000000000 +0100
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: s3fs
-Version: 2021.11.1
+Version: 2022.1.0
 Summary: Convenient Filesystem interface over S3
 Home-page: http://github.com/fsspec/s3fs/
 Maintainer: Martin Durant
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/s3fs.egg-info/requires.txt new/s3fs-2022.1.0/s3fs.egg-info/requires.txt
--- old/s3fs-2021.11.1/s3fs.egg-info/requires.txt       2021-11-26 23:17:20.000000000 +0100
+++ new/s3fs-2022.1.0/s3fs.egg-info/requires.txt        2022-01-11 20:52:38.000000000 +0100
@@ -1,9 +1,9 @@
-aiobotocore~=2.0.1
-fsspec==2021.11.1
+aiobotocore~=2.1.0
+fsspec==2022.01.0
 aiohttp<=4
 
 [awscli]
-aiobotocore[awscli]~=2.0.1
+aiobotocore[awscli]~=2.1.0
 
 [boto3]
-aiobotocore[boto3]~=2.0.1
+aiobotocore[boto3]~=2.1.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/s3fs-2021.11.1/versioneer.py new/s3fs-2022.1.0/versioneer.py
--- old/s3fs-2021.11.1/versioneer.py    2021-09-24 16:14:28.000000000 +0200
+++ new/s3fs-2022.1.0/versioneer.py     2021-12-13 21:26:38.000000000 +0100
@@ -719,7 +719,7 @@
         # TAG-NUM-gHEX
         mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
         if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
+            # unparsable. Maybe git-describe is misbehaving?
             pieces["error"] = ("unable to parse git-describe output: '%%s'"
                                %% describe_out)
             return pieces
@@ -1216,7 +1216,7 @@
         # TAG-NUM-gHEX
         mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
         if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
+            # unparsable. Maybe git-describe is misbehaving?
             pieces["error"] = ("unable to parse git-describe output: '%s'"
                                % describe_out)
             return pieces
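
The comment corrected in the two hunks above sits next to versioneer's parsing of git-describe output in TAG-NUM-gHEX form. A standalone sketch of that regex, using an invented describe string (the short hash here just happens to be a prefix of the new revision id shown earlier):

    import re

    # Invented example of git-describe output: <tag>-<commits since tag>-g<short hash>
    git_describe = "2022.01.0-3-gdfa1cd2"

    mo = re.search(r"^(.+)-(\d+)-g([0-9a-f]+)$", git_describe)
    if not mo:
        # unparsable: maybe git-describe is misbehaving?
        raise ValueError("unable to parse git-describe output: %r" % git_describe)

    tag, distance, short_hash = mo.group(1), int(mo.group(2)), mo.group(3)
    print(tag, distance, short_hash)  # 2022.01.0 3 dfa1cd2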
