Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-python-json-logger for
openSUSE:Factory checked in at 2023-05-03 12:57:53
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-python-json-logger (Old)
and /work/SRC/openSUSE:Factory/.python-python-json-logger.new.1533 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-python-json-logger"
Wed May 3 12:57:53 2023 rev:8 rq:1084222 version:2.0.7
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-python-json-logger/python-python-json-logger.changes 2022-10-08 01:25:49.066311229 +0200
+++ /work/SRC/openSUSE:Factory/.python-python-json-logger.new.1533/python-python-json-logger.changes 2023-05-03 12:57:57.956075272 +0200
@@ -1,0 +2,21 @@
+Wed May 3 09:09:17 UTC 2023 - Dirk Müller <[email protected]>
+
+- update to 2.0.7:
+ * Fix inclusion of py.typed in pip packages - @sth
+ * Added pytest support with test file rename. Migrated to
+ assertEqual
+ * Parameter `rename_fields` in merge_record_extra is now
+ optional - @afallou
+ * Allow reserved attrs to be renamed - @henkhogan
+ * Support added for Python 3.11
+ * Now verifying builds in Pypy 3.9 as well
+ * Type annotations are now in the package - @louis-jaris
+ * Fix rename_fields for exc_info - @guilhermeferrari
+ * Cleaned up test file for PEP8 - @lopagela
+ * Cleaned up old Python 2 artifacts - @louis-jaris
+ * Dropped Python 3.5 support - @idomozes
+ * Moved type check via tox into 3.11 run only
+ * Added test run in Python3.6 (will keep for a little while
+ longer, but it's EOL so upgrade)
+
+-------------------------------------------------------------------
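[Editor's note] For readers who want to see what the rename_fields changes listed above look like in practice, here is a minimal usage sketch (illustrative only, not part of the package or of this submission; the logger name and renamed key are made up):

    import logging
    from pythonjsonlogger import jsonlogger

    handler = logging.StreamHandler()
    # Ask the formatter to emit the standard "levelname" attribute as "log.level".
    handler.setFormatter(jsonlogger.JsonFormatter(
        "%(levelname)s %(message)s",
        rename_fields={"levelname": "log.level"},
    ))
    demo = logging.getLogger("rename-demo")
    demo.addHandler(handler)
    demo.warning("hello")
    # emits one JSON line whose keys are "message" and "log.level"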
Old:
----
python-json-logger-2.0.4.tar.gz
New:
----
python-json-logger-2.0.7.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-python-json-logger.spec ++++++
--- /var/tmp/diff_new_pack.mkys0C/_old 2023-05-03 12:57:58.432078069 +0200
+++ /var/tmp/diff_new_pack.mkys0C/_new 2023-05-03 12:57:58.436078092 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-python-json-logger
#
-# Copyright (c) 2022 SUSE LLC
+# Copyright (c) 2023 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,7 +19,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%define skip_python2 1
Name: python-python-json-logger
-Version: 2.0.4
+Version: 2.0.7
Release: 0
Summary: A python library adding a json log formatter
License: BSD-2-Clause
++++++ python-json-logger-2.0.4.tar.gz -> python-json-logger-2.0.7.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/PKG-INFO new/python-json-logger-2.0.7/PKG-INFO
--- old/python-json-logger-2.0.4/PKG-INFO 2022-07-11 16:08:03.423041600 +0200
+++ new/python-json-logger-2.0.7/PKG-INFO 2023-02-21 18:40:04.544918000 +0100
@@ -1,26 +1,25 @@
Metadata-Version: 2.1
Name: python-json-logger
-Version: 2.0.4
+Version: 2.0.7
Summary: A python library adding a json log formatter
Home-page: http://github.com/madzak/python-json-logger
Author: Zakaria Zajac
Author-email: [email protected]
License: BSD
-Platform: UNKNOWN
Classifier: Development Status :: 6 - Mature
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
Classifier: Topic :: System :: Logging
-Requires-Python: >=3.5
+Requires-Python: >=3.6
Description-Content-Type: text/markdown
License-File: LICENSE
@@ -195,8 +194,6 @@
External Examples
=================
-- [Wesley Tanaka - Structured log files in Python using python-json-logger](https://wtanaka.com/node/8201)
+- [Wesley Tanaka - Structured log files in Python using python-json-logger](http://web.archive.org/web/20201130054012/https://wtanaka.com/node/8201)
-  [Archive](https://web.archive.org/web/20201130054012/https://wtanaka.com/node/8201)
-
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/README.md new/python-json-logger-2.0.7/README.md
--- old/python-json-logger-2.0.4/README.md 2022-07-11 16:07:55.000000000 +0200
+++ new/python-json-logger-2.0.7/README.md 2023-02-21 18:39:54.000000000 +0100
@@ -169,6 +169,6 @@
External Examples
=================
-- [Wesley Tanaka - Structured log files in Python using python-json-logger](https://wtanaka.com/node/8201)
+- [Wesley Tanaka - Structured log files in Python using python-json-logger](http://web.archive.org/web/20201130054012/https://wtanaka.com/node/8201)
-  [Archive](https://web.archive.org/web/20201130054012/https://wtanaka.com/node/8201)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/setup.cfg new/python-json-logger-2.0.7/setup.cfg
--- old/python-json-logger-2.0.4/setup.cfg 2022-07-11 16:08:03.423041600 +0200
+++ new/python-json-logger-2.0.7/setup.cfg 2023-02-21 18:40:04.544918000 +0100
@@ -1,6 +1,3 @@
-[bdist_wheel]
-python-tag = py3
-
[mypy]
mypy_path = src
namespace_packages = true
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/setup.py new/python-json-logger-2.0.7/setup.py
--- old/python-json-logger-2.0.4/setup.py 2022-07-11 16:07:55.000000000 +0200
+++ new/python-json-logger-2.0.7/setup.py 2023-02-21 18:39:54.000000000 +0100
@@ -8,7 +8,7 @@
setup(
name="python-json-logger",
- version="2.0.4",
+ version="2.0.7",
url="http://github.com/madzak/python-json-logger",
license="BSD",
include_package_data=True,
@@ -18,9 +18,10 @@
author="Zakaria Zajac",
author_email="[email protected]",
package_dir={'': 'src'},
+ package_data={"pythonjsonlogger": ["py.typed"]},
packages=find_packages("src", exclude="tests"),
# https://packaging.python.org/guides/distributing-packages-using-setuptools/#python-requires
- python_requires='>=3.5',
+ python_requires=">=3.6",
test_suite="tests.tests",
classifiers=[
'Development Status :: 6 - Mature',
@@ -29,12 +30,12 @@
'Operating System :: OS Independent',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
- 'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
+ 'Programming Language :: Python :: 3.11',
'Topic :: System :: Logging',
]
)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/src/python_json_logger.egg-info/PKG-INFO new/python-json-logger-2.0.7/src/python_json_logger.egg-info/PKG-INFO
--- old/python-json-logger-2.0.4/src/python_json_logger.egg-info/PKG-INFO 2022-07-11 16:08:03.000000000 +0200
+++ new/python-json-logger-2.0.7/src/python_json_logger.egg-info/PKG-INFO 2023-02-21 18:40:04.000000000 +0100
@@ -1,26 +1,25 @@
Metadata-Version: 2.1
Name: python-json-logger
-Version: 2.0.4
+Version: 2.0.7
Summary: A python library adding a json log formatter
Home-page: http://github.com/madzak/python-json-logger
Author: Zakaria Zajac
Author-email: [email protected]
License: BSD
-Platform: UNKNOWN
Classifier: Development Status :: 6 - Mature
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
Classifier: Topic :: System :: Logging
-Requires-Python: >=3.5
+Requires-Python: >=3.6
Description-Content-Type: text/markdown
License-File: LICENSE
@@ -195,8 +194,6 @@
External Examples
=================
-- [Wesley Tanaka - Structured log files in Python using python-json-logger](https://wtanaka.com/node/8201)
+- [Wesley Tanaka - Structured log files in Python using python-json-logger](http://web.archive.org/web/20201130054012/https://wtanaka.com/node/8201)
-  [Archive](https://web.archive.org/web/20201130054012/https://wtanaka.com/node/8201)
-
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/src/python_json_logger.egg-info/SOURCES.txt new/python-json-logger-2.0.7/src/python_json_logger.egg-info/SOURCES.txt
--- old/python-json-logger-2.0.4/src/python_json_logger.egg-info/SOURCES.txt 2022-07-11 16:08:03.000000000 +0200
+++ new/python-json-logger-2.0.7/src/python_json_logger.egg-info/SOURCES.txt 2023-02-21 18:40:04.000000000 +0100
@@ -9,5 +9,6 @@
src/python_json_logger.egg-info/top_level.txt
src/pythonjsonlogger/__init__.py
src/pythonjsonlogger/jsonlogger.py
+src/pythonjsonlogger/py.typed
tests/__init__.py
-tests/tests.py
\ No newline at end of file
+tests/test_jsonlogger.py
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/src/pythonjsonlogger/jsonlogger.py new/python-json-logger-2.0.7/src/pythonjsonlogger/jsonlogger.py
--- old/python-json-logger-2.0.4/src/pythonjsonlogger/jsonlogger.py 2022-07-11 16:07:55.000000000 +0200
+++ new/python-json-logger-2.0.7/src/pythonjsonlogger/jsonlogger.py 2023-02-21 18:39:54.000000000 +0100
@@ -9,7 +9,7 @@
import traceback
import importlib
-from typing import Any, Dict, Union, List, Tuple
+from typing import Any, Dict, Optional, Union, List, Tuple
from inspect import istraceback
@@ -24,20 +24,30 @@
'processName', 'relativeCreated', 'stack_info', 'thread', 'threadName')
-def merge_record_extra(record: logging.LogRecord, target: Dict, reserved: Union[Dict, List]) -> Dict:
+
+def merge_record_extra(
+ record: logging.LogRecord,
+ target: Dict,
+ reserved: Union[Dict, List],
+ rename_fields: Optional[Dict[str,str]] = None,
+) -> Dict:
"""
Merges extra attributes from LogRecord object into target dictionary
:param record: logging.LogRecord
:param target: dict to update
:param reserved: dict or list with reserved keys to skip
+ :param rename_fields: an optional dict, used to rename field names in the output.
+ Rename levelname to log.level: {'levelname': 'log.level'}
"""
+ if rename_fields is None:
+ rename_fields = {}
for key, value in record.__dict__.items():
# this allows to have numeric keys
if (key not in reserved
and not (hasattr(key, "startswith")
and key.startswith('_'))):
- target[key] = value
+ target[rename_fields.get(key,key)] = value
return target
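[Editor's note] The hunk above is the new merge_record_extra() signature with the optional rename_fields parameter from the 2.0.7 changelog. A small sketch of how it behaves (illustrative only; the extra attribute name is made up):

    import logging
    from pythonjsonlogger.jsonlogger import RESERVED_ATTRS, merge_record_extra

    record = logging.LogRecord("demo", logging.INFO, __file__, 1, "hi", None, None)
    record.request_id = "abc123"  # a custom attribute, as logging's extra= would add

    # Non-reserved attributes are copied into the target dict; request_id is renamed.
    target = merge_record_extra(record, {}, reserved=RESERVED_ATTRS,
                                rename_fields={"request_id": "http.request.id"})
    # target now contains {"http.request.id": "abc123"} (plus any LogRecord
    # attributes newer than this RESERVED_ATTRS list, on newer Python versions)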
@@ -168,18 +178,23 @@
Override this method to implement custom logic for adding fields.
"""
for field in self._required_fields:
- if field in self.rename_fields:
- log_record[self.rename_fields[field]] = record.__dict__.get(field)
- else:
- log_record[field] = record.__dict__.get(field)
+ log_record[field] = record.__dict__.get(field)
+
log_record.update(self.static_fields)
log_record.update(message_dict)
- merge_record_extra(record, log_record, reserved=self._skip_fields)
+ merge_record_extra(record, log_record, reserved=self._skip_fields, rename_fields=self.rename_fields)
if self.timestamp:
key = self.timestamp if type(self.timestamp) == str else 'timestamp'
log_record[key] = datetime.fromtimestamp(record.created, tz=timezone.utc)
+ self._perform_rename_log_fields(log_record)
+
+ def _perform_rename_log_fields(self, log_record):
+ for old_field_name, new_field_name in self.rename_fields.items():
+ log_record[new_field_name] = log_record[old_field_name]
+ del log_record[old_field_name]
+
def process_log_record(self, log_record):
"""
Override this method to implement custom logic
@@ -204,9 +219,9 @@
message_dict: Dict[str, Any] = {}
# FIXME: logging.LogRecord.msg and logging.LogRecord.message in typeshed
# are always type of str. We shouldn't need to override that.
- if isinstance(record.msg, dict): # type: ignore
- message_dict = record.msg # type: ignore
- record.message = None
+ if isinstance(record.msg, dict):
+ message_dict = record.msg
+ record.message = ""
else:
record.message = record.getMessage()
# only format time if needed
@@ -221,15 +236,10 @@
message_dict['exc_info'] = record.exc_text
# Display formatted record of stack frames
# default format is a string returned from :func:`traceback.print_stack`
- try:
- if record.stack_info and not message_dict.get('stack_info'):
- message_dict['stack_info'] = self.formatStack(record.stack_info)
- except AttributeError:
- # Python2.7 doesn't have stack_info.
- pass
+ if record.stack_info and not message_dict.get('stack_info'):
+ message_dict['stack_info'] = self.formatStack(record.stack_info)
- log_record: Dict[str, Any]
- log_record = OrderedDict()
+ log_record: Dict[str, Any] = OrderedDict()
self.add_fields(log_record, record, message_dict)
log_record = self.process_log_record(log_record)
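[Editor's note] The add_fields()/_perform_rename_log_fields() changes in the hunks above are what make renaming work for exc_info (the "Fix rename_fields for exc_info" changelog entry). A hedged end-to-end sketch of the behaviour the new test_exc_info_renamed test exercises (logger name is illustrative):

    import logging
    from pythonjsonlogger import jsonlogger

    logger = logging.getLogger("exc-rename-demo")
    handler = logging.StreamHandler()
    handler.setFormatter(jsonlogger.JsonFormatter(
        "%(exc_info)s %(message)s",
        rename_fields={"exc_info": "stack_trace"},
    ))
    logger.addHandler(handler)

    try:
        1 / 0
    except ZeroDivisionError:
        # The formatted traceback is emitted under "stack_trace" instead of "exc_info".
        logger.exception("division failed")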
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/src/pythonjsonlogger/py.typed new/python-json-logger-2.0.7/src/pythonjsonlogger/py.typed
--- old/python-json-logger-2.0.4/src/pythonjsonlogger/py.typed 1970-01-01 01:00:00.000000000 +0100
+++ new/python-json-logger-2.0.7/src/pythonjsonlogger/py.typed 2023-02-21 18:39:54.000000000 +0100
@@ -0,0 +1 @@
+# PEP-561 marker. https://mypy.readthedocs.io/en/latest/installed_packages.html
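[Editor's note] The py.typed marker above (shipped via the package_data entry added in setup.py) is what lets type checkers pick up the bundled annotations mentioned in the changelog. A minimal downstream sketch, assuming python-json-logger >= 2.0.7 is installed; the file name is made up:

    # check_types.py -- run with e.g. "mypy check_types.py"; with py.typed present,
    # mypy uses the package's own annotations instead of treating it as untyped.
    from pythonjsonlogger import jsonlogger

    formatter: jsonlogger.JsonFormatter = jsonlogger.JsonFormatter(timestamp=True)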
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/tests/test_jsonlogger.py new/python-json-logger-2.0.7/tests/test_jsonlogger.py
--- old/python-json-logger-2.0.4/tests/test_jsonlogger.py 1970-01-01 01:00:00.000000000 +0100
+++ new/python-json-logger-2.0.7/tests/test_jsonlogger.py 2023-02-21 18:39:54.000000000 +0100
@@ -0,0 +1,317 @@
+# -*- coding: utf-8 -*-
+import unittest
+import unittest.mock
+import logging
+import json
+import sys
+import traceback
+import random
+
+try:
+ import xmlrunner # noqa
+except ImportError:
+ pass
+
+from io import StringIO
+
+sys.path.append('src/python-json-logger')
+from pythonjsonlogger import jsonlogger
+import datetime
+
+
+class TestJsonLogger(unittest.TestCase):
+ def setUp(self):
+ self.log = logging.getLogger("logging-test-{}".format(random.randint(1, 101)))
+ self.log.setLevel(logging.DEBUG)
+ self.buffer = StringIO()
+
+ self.log_handler = logging.StreamHandler(self.buffer)
+ self.log.addHandler(self.log_handler)
+
+ def test_default_format(self):
+ fr = jsonlogger.JsonFormatter()
+ self.log_handler.setFormatter(fr)
+
+ msg = "testing logging format"
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+
+ self.assertEqual(log_json["message"], msg)
+
+ def test_percentage_format(self):
+ fr = jsonlogger.JsonFormatter(
+ # All kind of different styles to check the regex
+ '[%(levelname)8s] %(message)s %(filename)s:%(lineno)d %(asctime)'
+ )
+ self.log_handler.setFormatter(fr)
+
+ msg = "testing logging format"
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+
+ self.assertEqual(log_json["message"], msg)
+ self.assertEqual(log_json.keys(), {'levelname', 'message', 'filename', 'lineno', 'asctime'})
+
+ def test_rename_base_field(self):
+ fr = jsonlogger.JsonFormatter(rename_fields={'message': '@message'})
+ self.log_handler.setFormatter(fr)
+
+ msg = "testing logging format"
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+
+ self.assertEqual(log_json["@message"], msg)
+
+ def test_rename_nonexistent_field(self):
+ fr = jsonlogger.JsonFormatter(rename_fields={'nonexistent_key': 'new_name'})
+ self.log_handler.setFormatter(fr)
+
+ stderr_watcher = StringIO()
+ sys.stderr = stderr_watcher
+ self.log.info("testing logging rename")
+
+ self.assertTrue("KeyError: 'nonexistent_key'" in stderr_watcher.getvalue())
+
+ def test_add_static_fields(self):
+ fr = jsonlogger.JsonFormatter(static_fields={'log_stream': 'kafka'})
+
+ self.log_handler.setFormatter(fr)
+
+ msg = "testing static fields"
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+
+ self.assertEqual(log_json["log_stream"], "kafka")
+ self.assertEqual(log_json["message"], msg)
+
+ def test_format_keys(self):
+ supported_keys = [
+ 'asctime',
+ 'created',
+ 'filename',
+ 'funcName',
+ 'levelname',
+ 'levelno',
+ 'lineno',
+ 'module',
+ 'msecs',
+ 'message',
+ 'name',
+ 'pathname',
+ 'process',
+ 'processName',
+ 'relativeCreated',
+ 'thread',
+ 'threadName'
+ ]
+
+ log_format = lambda x: ['%({0:s})s'.format(i) for i in x]
+ custom_format = ' '.join(log_format(supported_keys))
+
+ fr = jsonlogger.JsonFormatter(custom_format)
+ self.log_handler.setFormatter(fr)
+
+ msg = "testing logging format"
+ self.log.info(msg)
+ log_msg = self.buffer.getvalue()
+ log_json = json.loads(log_msg)
+
+ for supported_key in supported_keys:
+ if supported_key in log_json:
+ self.assertTrue(True)
+
+ def test_unknown_format_key(self):
+ fr = jsonlogger.JsonFormatter('%(unknown_key)s %(message)s')
+
+ self.log_handler.setFormatter(fr)
+ msg = "testing unknown logging format"
+ try:
+ self.log.info(msg)
+ except Exception:
+ self.assertTrue(False, "Should succeed")
+
+ def test_log_adict(self):
+ fr = jsonlogger.JsonFormatter()
+ self.log_handler.setFormatter(fr)
+
+ msg = {"text": "testing logging", "num": 1, 5: "9",
+ "nested": {"more": "data"}}
+
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("text"), msg["text"])
+ self.assertEqual(log_json.get("num"), msg["num"])
+ self.assertEqual(log_json.get("5"), msg[5])
+ self.assertEqual(log_json.get("nested"), msg["nested"])
+ self.assertEqual(log_json["message"], "")
+
+ def test_log_extra(self):
+ fr = jsonlogger.JsonFormatter()
+ self.log_handler.setFormatter(fr)
+
+ extra = {"text": "testing logging", "num": 1, 5: "9",
+ "nested": {"more": "data"}}
+ self.log.info("hello", extra=extra)
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("text"), extra["text"])
+ self.assertEqual(log_json.get("num"), extra["num"])
+ self.assertEqual(log_json.get("5"), extra[5])
+ self.assertEqual(log_json.get("nested"), extra["nested"])
+ self.assertEqual(log_json["message"], "hello")
+
+ def test_json_default_encoder(self):
+ fr = jsonlogger.JsonFormatter()
+ self.log_handler.setFormatter(fr)
+
+ msg = {"adate": datetime.datetime(1999, 12, 31, 23, 59),
+ "otherdate": datetime.date(1789, 7, 14),
+ "otherdatetime": datetime.datetime(1789, 7, 14, 23, 59),
+ "otherdatetimeagain": datetime.datetime(1900, 1, 1)}
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("adate"), "1999-12-31T23:59:00")
+ self.assertEqual(log_json.get("otherdate"), "1789-07-14")
+ self.assertEqual(log_json.get("otherdatetime"), "1789-07-14T23:59:00")
+ self.assertEqual(log_json.get("otherdatetimeagain"),
+ "1900-01-01T00:00:00")
+
+ @unittest.mock.patch('time.time', return_value=1500000000.0)
+ def test_json_default_encoder_with_timestamp(self, time_mock):
+ fr = jsonlogger.JsonFormatter(timestamp=True)
+ self.log_handler.setFormatter(fr)
+
+ self.log.info("Hello")
+
+ self.assertTrue(time_mock.called)
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("timestamp"), "2017-07-14T02:40:00+00:00")
+
+ def test_json_custom_default(self):
+ def custom(o):
+ return "very custom"
+ fr = jsonlogger.JsonFormatter(json_default=custom)
+ self.log_handler.setFormatter(fr)
+
+ msg = {"adate": datetime.datetime(1999, 12, 31, 23, 59),
+ "normal": "value"}
+ self.log.info(msg)
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("adate"), "very custom")
+ self.assertEqual(log_json.get("normal"), "value")
+
+ def test_json_custom_logic_adds_field(self):
+ class CustomJsonFormatter(jsonlogger.JsonFormatter):
+
+ def process_log_record(self, log_record):
+ log_record["custom"] = "value"
+ # Old Style "super" since Python 2.6's logging.Formatter is old
+ # style
+ return jsonlogger.JsonFormatter.process_log_record(self, log_record)
+
+ self.log_handler.setFormatter(CustomJsonFormatter())
+ self.log.info("message")
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("custom"), "value")
+
+ def get_traceback_from_exception_followed_by_log_call(self) -> str:
+ try:
+ raise Exception('test')
+ except Exception:
+ self.log.exception("hello")
+ str_traceback = traceback.format_exc()
+ # Formatter removes trailing new line
+ if str_traceback.endswith('\n'):
+ str_traceback = str_traceback[:-1]
+
+ return str_traceback
+
+ def test_exc_info(self):
+ fr = jsonlogger.JsonFormatter()
+ self.log_handler.setFormatter(fr)
+ expected_value = self.get_traceback_from_exception_followed_by_log_call()
+
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("exc_info"), expected_value)
+
+ def test_exc_info_renamed(self):
+ fr = jsonlogger.JsonFormatter("%(exc_info)s", rename_fields={"exc_info": "stack_trace"})
+ self.log_handler.setFormatter(fr)
+ expected_value = self.get_traceback_from_exception_followed_by_log_call()
+
+ log_json = json.loads(self.buffer.getvalue())
+ self.assertEqual(log_json.get("stack_trace"), expected_value)
+ self.assertEqual(log_json.get("exc_info"), None)
+
+ def test_ensure_ascii_true(self):
+ fr = jsonlogger.JsonFormatter()
+ self.log_handler.setFormatter(fr)
+ self.log.info('Привет')
+ msg = self.buffer.getvalue().split('"message": "', 1)[1].split('"', 1)[0]
+ self.assertEqual(msg, r"\u041f\u0440\u0438\u0432\u0435\u0442")
+
+ def test_ensure_ascii_false(self):
+ fr = jsonlogger.JsonFormatter(json_ensure_ascii=False)
+ self.log_handler.setFormatter(fr)
+ self.log.info('Привет')
+ msg = self.buffer.getvalue().split('"message": "', 1)[1].split('"', 1)[0]
+ self.assertEqual(msg, "Привет")
+
+ def test_custom_object_serialization(self):
+ def encode_complex(z):
+ if isinstance(z, complex):
+ return (z.real, z.imag)
+ else:
+ type_name = z.__class__.__name__
+ raise TypeError("Object of type '{}' is no JSON serializable".format(type_name))
+
+ formatter = jsonlogger.JsonFormatter(json_default=encode_complex,
+ json_encoder=json.JSONEncoder)
+ self.log_handler.setFormatter(formatter)
+
+ value = {
+ "special": complex(3, 8),
+ }
+
+ self.log.info(" message", extra=value)
+ msg = self.buffer.getvalue()
+ self.assertEqual(msg, "{\"message\": \" message\", \"special\": [3.0, 8.0]}\n")
+
+ def test_rename_reserved_attrs(self):
+ log_format = lambda x: ['%({0:s})s'.format(i) for i in x]
+ reserved_attrs_map = {
+ 'exc_info': 'error.type',
+ 'exc_text': 'error.message',
+ 'funcName': 'log.origin.function',
+ 'levelname': 'log.level',
+ 'module': 'log.origin.file.name',
+ 'processName': 'process.name',
+ 'threadName': 'process.thread.name',
+ 'msg': 'log.message'
+ }
+
+ custom_format = ' '.join(log_format(reserved_attrs_map.keys()))
+ reserved_attrs = [_ for _ in jsonlogger.RESERVED_ATTRS if _ not in list(reserved_attrs_map.keys())]
+ formatter = jsonlogger.JsonFormatter(custom_format, reserved_attrs=reserved_attrs, rename_fields=reserved_attrs_map)
+ self.log_handler.setFormatter(formatter)
+ self.log.info("message")
+
+ msg = self.buffer.getvalue()
+ self.assertEqual(msg, '{"error.type": null, "error.message": null, "log.origin.function": "test_rename_reserved_attrs", "log.level": "INFO", "log.origin.file.name": "test_jsonlogger", "process.name": "MainProcess", "process.thread.name": "MainThread", "log.message": "message"}\n')
+
+ def test_merge_record_extra(self):
+ record = logging.LogRecord("name", level=1, pathname="", lineno=1, msg="Some message", args=None, exc_info=None)
+ output = jsonlogger.merge_record_extra(record, target=dict(foo="bar"), reserved=[])
+ self.assertIn("foo", output)
+ self.assertIn("msg", output)
+ self.assertEqual(output["foo"], "bar")
+ self.assertEqual(output["msg"], "Some message")
+
+
+if __name__ == '__main__':
+ if len(sys.argv[1:]) > 0:
+ if sys.argv[1] == 'xml':
+ testSuite = unittest.TestLoader().loadTestsFromTestCase(
+ TestJsonLogger)
+ xmlrunner.XMLTestRunner(output='reports').run(testSuite)
+ else:
+ unittest.main()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-json-logger-2.0.4/tests/tests.py new/python-json-logger-2.0.7/tests/tests.py
--- old/python-json-logger-2.0.4/tests/tests.py 2022-07-11 16:07:55.000000000 +0200
+++ new/python-json-logger-2.0.7/tests/tests.py 1970-01-01 01:00:00.000000000 +0100
@@ -1,268 +0,0 @@
-# -*- coding: utf-8 -*-
-import unittest
-import unittest.mock
-import logging
-import json
-import sys
-import traceback
-import random
-
-try:
- import xmlrunner # noqa
-except ImportError:
- pass
-
-try:
- from StringIO import StringIO # noqa
-except ImportError:
- # Python 3 Support
- from io import StringIO
-
-sys.path.append('src/python-json-logger')
-from pythonjsonlogger import jsonlogger
-import datetime
-
-
-class TestJsonLogger(unittest.TestCase):
- def setUp(self):
- self.logger = logging.getLogger("logging-test-{}".format(random.randint(1, 101)))
- self.logger.setLevel(logging.DEBUG)
- self.buffer = StringIO()
-
- self.logHandler = logging.StreamHandler(self.buffer)
- self.logger.addHandler(self.logHandler)
-
- def testDefaultFormat(self):
- fr = jsonlogger.JsonFormatter()
- self.logHandler.setFormatter(fr)
-
- msg = "testing logging format"
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
-
- self.assertEqual(logJson["message"], msg)
-
- def testPercentageFormat(self):
- fr = jsonlogger.JsonFormatter(
- # All kind of different styles to check the regex
- '[%(levelname)8s] %(message)s %(filename)s:%(lineno)d %(asctime)'
- )
- self.logHandler.setFormatter(fr)
-
- msg = "testing logging format"
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
-
- self.assertEqual(logJson["message"], msg)
- self.assertEqual(logJson.keys(), {'levelname', 'message', 'filename', 'lineno', 'asctime'})
-
- def testRenameBaseField(self):
- fr = jsonlogger.JsonFormatter(rename_fields={'message': '@message'})
- self.logHandler.setFormatter(fr)
-
- msg = "testing logging format"
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
-
- self.assertEqual(logJson["@message"], msg)
-
- def testAddStaticFields(self):
- fr = jsonlogger.JsonFormatter(static_fields={'log_stream': 'kafka'})
-
- self.logHandler.setFormatter(fr)
-
- msg = "testing static fields"
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
-
- self.assertEqual(logJson["log_stream"], "kafka")
- self.assertEqual(logJson["message"], msg)
-
-
- def testFormatKeys(self):
- supported_keys = [
- 'asctime',
- 'created',
- 'filename',
- 'funcName',
- 'levelname',
- 'levelno',
- 'lineno',
- 'module',
- 'msecs',
- 'message',
- 'name',
- 'pathname',
- 'process',
- 'processName',
- 'relativeCreated',
- 'thread',
- 'threadName'
- ]
-
- log_format = lambda x: ['%({0:s})s'.format(i) for i in x]
- custom_format = ' '.join(log_format(supported_keys))
-
- fr = jsonlogger.JsonFormatter(custom_format)
- self.logHandler.setFormatter(fr)
-
- msg = "testing logging format"
- self.logger.info(msg)
- log_msg = self.buffer.getvalue()
- log_json = json.loads(log_msg)
-
- for supported_key in supported_keys:
- if supported_key in log_json:
- self.assertTrue(True)
-
- def testUnknownFormatKey(self):
- fr = jsonlogger.JsonFormatter('%(unknown_key)s %(message)s')
-
- self.logHandler.setFormatter(fr)
- msg = "testing unknown logging format"
- try:
- self.logger.info(msg)
- except:
- self.assertTrue(False, "Should succeed")
-
- def testLogADict(self):
- fr = jsonlogger.JsonFormatter()
- self.logHandler.setFormatter(fr)
-
- msg = {"text": "testing logging", "num": 1, 5: "9",
- "nested": {"more": "data"}}
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("text"), msg["text"])
- self.assertEqual(logJson.get("num"), msg["num"])
- self.assertEqual(logJson.get("5"), msg[5])
- self.assertEqual(logJson.get("nested"), msg["nested"])
- self.assertEqual(logJson["message"], None)
-
- def testLogExtra(self):
- fr = jsonlogger.JsonFormatter()
- self.logHandler.setFormatter(fr)
-
- extra = {"text": "testing logging", "num": 1, 5: "9",
- "nested": {"more": "data"}}
- self.logger.info("hello", extra=extra)
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("text"), extra["text"])
- self.assertEqual(logJson.get("num"), extra["num"])
- self.assertEqual(logJson.get("5"), extra[5])
- self.assertEqual(logJson.get("nested"), extra["nested"])
- self.assertEqual(logJson["message"], "hello")
-
- def testJsonDefaultEncoder(self):
- fr = jsonlogger.JsonFormatter()
- self.logHandler.setFormatter(fr)
-
- msg = {"adate": datetime.datetime(1999, 12, 31, 23, 59),
- "otherdate": datetime.date(1789, 7, 14),
- "otherdatetime": datetime.datetime(1789, 7, 14, 23, 59),
- "otherdatetimeagain": datetime.datetime(1900, 1, 1)}
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("adate"), "1999-12-31T23:59:00")
- self.assertEqual(logJson.get("otherdate"), "1789-07-14")
- self.assertEqual(logJson.get("otherdatetime"), "1789-07-14T23:59:00")
- self.assertEqual(logJson.get("otherdatetimeagain"),
- "1900-01-01T00:00:00")
-
- @unittest.mock.patch('time.time', return_value=1500000000.0)
- def testJsonDefaultEncoderWithTimestamp(self, time_mock):
- fr = jsonlogger.JsonFormatter(timestamp=True)
- self.logHandler.setFormatter(fr)
-
- self.logger.info("Hello")
-
- self.assertTrue(time_mock.called)
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("timestamp"), "2017-07-14T02:40:00+00:00")
-
- def testJsonCustomDefault(self):
- def custom(o):
- return "very custom"
- fr = jsonlogger.JsonFormatter(json_default=custom)
- self.logHandler.setFormatter(fr)
-
- msg = {"adate": datetime.datetime(1999, 12, 31, 23, 59),
- "normal": "value"}
- self.logger.info(msg)
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("adate"), "very custom")
- self.assertEqual(logJson.get("normal"), "value")
-
- def testJsonCustomLogicAddsField(self):
- class CustomJsonFormatter(jsonlogger.JsonFormatter):
-
- def process_log_record(self, log_record):
- log_record["custom"] = "value"
- # Old Style "super" since Python 2.6's logging.Formatter is old
- # style
- return jsonlogger.JsonFormatter.process_log_record(self, log_record)
-
- self.logHandler.setFormatter(CustomJsonFormatter())
- self.logger.info("message")
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("custom"), "value")
-
- def testExcInfo(self):
- fr = jsonlogger.JsonFormatter()
- self.logHandler.setFormatter(fr)
- try:
- raise Exception('test')
- except Exception:
-
- self.logger.exception("hello")
-
- expected_value = traceback.format_exc()
- # Formatter removes trailing new line
- if expected_value.endswith('\n'):
- expected_value = expected_value[:-1]
-
- logJson = json.loads(self.buffer.getvalue())
- self.assertEqual(logJson.get("exc_info"), expected_value)
-
- def testEnsureAsciiTrue(self):
- fr = jsonlogger.JsonFormatter()
- self.logHandler.setFormatter(fr)
- self.logger.info('Привет')
- msg = self.buffer.getvalue().split('"message": "', 1)[1].split('"', 1)[0]
- self.assertEqual(msg, r"\u041f\u0440\u0438\u0432\u0435\u0442")
-
- def testEnsureAsciiFalse(self):
- fr = jsonlogger.JsonFormatter(json_ensure_ascii=False)
- self.logHandler.setFormatter(fr)
- self.logger.info('Привет')
- msg = self.buffer.getvalue().split('"message": "', 1)[1].split('"', 1)[0]
- self.assertEqual(msg, "Привет")
-
- def testCustomObjectSerialization(self):
- def encode_complex(z):
- if isinstance(z, complex):
- return (z.real, z.imag)
- else:
- type_name = z.__class__.__name__
- raise TypeError("Object of type '{}' is no JSON serializable".format(type_name))
-
- formatter = jsonlogger.JsonFormatter(json_default=encode_complex,
- json_encoder=json.JSONEncoder)
- self.logHandler.setFormatter(formatter)
-
- value = {
- "special": complex(3, 8),
- }
-
- self.logger.info(" message", extra=value)
- msg = self.buffer.getvalue()
- self.assertEqual(msg, "{\"message\": \" message\", \"special\": [3.0, 8.0]}\n")
-
-if __name__ == '__main__':
- if len(sys.argv[1:]) > 0:
- if sys.argv[1] == 'xml':
- testSuite = unittest.TestLoader().loadTestsFromTestCase(
- TestJsonLogger)
- xmlrunner.XMLTestRunner(output='reports').run(testSuite)
- else:
- unittest.main()