Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package translate-toolkit for openSUSE:Factory checked in at 2026-03-26 21:07:38
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/translate-toolkit (Old)
 and      /work/SRC/openSUSE:Factory/.translate-toolkit.new.8177 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "translate-toolkit"

Thu Mar 26 21:07:38 2026 rev:68 rq:1342426 version:3.19.3

Changes:
--------
--- /work/SRC/openSUSE:Factory/translate-toolkit/translate-toolkit.changes      2026-03-05 17:14:59.414255852 +0100
+++ /work/SRC/openSUSE:Factory/.translate-toolkit.new.8177/translate-toolkit.changes    2026-03-27 06:51:30.942937947 +0100
@@ -1,0 +2,15 @@
+Tue Mar 24 22:57:05 UTC 2026 - Dirk Müller <[email protected]>
+
+- update to 3.19.3
+  * Markdown:
+    - Support extracting code blocks for translation.
+    - Added --no-code-blocks option to md2po and po2md.
+  * Apple strings (.strings):
+    - Improved preservation of whitespace and comments.
+    - Generic comments are now hidden from notes but
+      preserved on round-trip.
+  * CI/Development:
+    - Switched from pre-commit to prek.
+  * Fixed blank line preservation in property converters.
+
+-------------------------------------------------------------------
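The new Markdown code-block extraction described above can be pictured with a minimal standalone sketch. This is an illustration only, not the toolkit's actual parser; the function name `extract_units` and its signature are hypothetical, loosely modelled on the `extract_code_blocks` flag the release adds:

```python
def extract_units(md_text: str, extract_code_blocks: bool = True) -> list[str]:
    """Toy model: collect translatable strings from fenced code blocks.

    When extract_code_blocks is False (the --no-code-blocks behaviour),
    fenced block contents are skipped entirely.
    """
    units: list[str] = []
    in_fence = False
    block: list[str] = []
    for line in md_text.splitlines():
        if line.startswith("```"):
            if in_fence:
                # closing fence: emit the accumulated block, if enabled
                if extract_code_blocks and block:
                    units.append("\n".join(block))
                block = []
            in_fence = not in_fence
        elif in_fence:
            block.append(line)
    return units
```

With the flag left at its default, a fenced block becomes one translation unit; passing `extract_code_blocks=False` mirrors the opt-out added to md2po and po2md.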

Old:
----
  3.19.2.tar.gz

New:
----
  3.19.3.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ translate-toolkit.spec ++++++
--- /var/tmp/diff_new_pack.uE4z1D/_old  2026-03-27 06:51:31.702969320 +0100
+++ /var/tmp/diff_new_pack.uE4z1D/_new  2026-03-27 06:51:31.706969486 +0100
@@ -51,7 +51,7 @@
 %define manpages translatetoolkit %binaries_and_manpages
 
 Name:           translate-toolkit%{psuffix}
-Version:        3.19.2
+Version:        3.19.3
 Release:        0
 Summary:        Tools and API to assist with translation and software localization
 License:        GPL-2.0-or-later

++++++ 3.19.2.tar.gz -> 3.19.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/copilot-instructions.md 
new/translate-3.19.3/.github/copilot-instructions.md
--- old/translate-3.19.2/.github/copilot-instructions.md        2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/copilot-instructions.md        1970-01-01 
01:00:00.000000000 +0100
@@ -1,28 +0,0 @@
-# Copilot Instructions for translate-toolkit
-
-# Existing documentation
-
-Follow existing contributor documentation in `docs/developers/`.
-
-## Code Style and Standards
-
-- Follow PEP 8 standards
-- Use type annotations
-
-### Linting and Formatting
-
-- Pre-commit hooks are configured (`.pre-commit-config.yaml`)
-- Use pylint for Python code quality
-- Follow existing code formatting patterns
-- Run `pre-commit run --all-files` before committing
-- Type check new code with `mypy`
-
-## Documentation
-
-- Document new feature in `docs/`
-
-### Dependencies
-
-- Manage dependencies in `pyproject.toml`
-- Use dependency groups for different environments (dev, test, docs, etc.)
-- Keep dependencies up to date with security considerations
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/docs.yml 
new/translate-3.19.3/.github/workflows/docs.yml
--- old/translate-3.19.2/.github/workflows/docs.yml     2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/docs.yml     2026-03-03 
17:06:26.000000000 +0100
@@ -22,7 +22,7 @@
       uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
       with:
         python-version: 3.14
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
     - name: Install apt dependencies
       run: |
         sudo apt-get update
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/pre-commit.yml 
new/translate-3.19.3/.github/workflows/pre-commit.yml
--- old/translate-3.19.2/.github/workflows/pre-commit.yml       2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/pre-commit.yml       2026-03-03 
17:06:26.000000000 +0100
@@ -17,27 +17,23 @@
     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
       with:
         persist-credentials: false
-    - name: Get cache tag
-      id: get-date
-      run: |
-        echo "cache_tag=$(/bin/date --utc '+%Y%m')" >> "$GITHUB_OUTPUT"
-        echo "previous_cache_tag=$(/bin/date --date='1 month ago' --utc '+%Y%m')" >> "$GITHUB_OUTPUT"
-      shell: bash
     - uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
-      id: pre-commit-cache
       with:
-        path: ~/.cache/pre-commit
-        key: ${{ runner.os }}-pre-commit-${{ steps.get-date.outputs.cache_tag }}-${{ hashFiles('.pre-commit-config.yaml') }}
+        path: ~/.cache/prek
+        key: ${{ runner.os }}-prek-${{ hashFiles('.pre-commit-config.yaml') }}
         restore-keys: |
-          ${{ runner.os }}-pre-commit-${{ steps.get-date.outputs.cache_tag }}
-          ${{ runner.os }}-pre-commit-${{ steps.get-date.outputs.previous_cache_tag }}
-          ${{ runner.os }}-pre-commit-
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+          ${{ runner.os }}-prek-
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
     - uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
       with:
         python-version: 3.14
-    - name: pre-commit
-      run: uv run --no-sources --only-group pre-commit pre-commit run --all
+    - name: prek
+      run: uv run --no-sources --only-group pre-commit prek run --all-files
       env:
         RUFF_OUTPUT_FORMAT: github
         REUSE_OUTPUT_FORMAT: github
+    - name: diff
+      run: git diff
+      if: always()
+    - name: prek-prune
+      run: uv run --no-sources --only-group pre-commit prek cache gc
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/pylint.yml 
new/translate-3.19.3/.github/workflows/pylint.yml
--- old/translate-3.19.2/.github/workflows/pylint.yml   2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/pylint.yml   2026-03-03 
17:06:26.000000000 +0100
@@ -26,11 +26,11 @@
       id: setup_python
       with:
         python-version: '3.14'
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
       with:
         save-cache: ${{ github.ref == 'refs/heads/master' }}
         cache-suffix: ${{ steps.setup_python.outputs.python-version }}
-        version: 0.10.6
+        version: 0.10.7
     - name: Install deps
       run: uv sync --all-extras --group pylint
     - name: Run pylint
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/setup.yml 
new/translate-3.19.3/.github/workflows/setup.yml
--- old/translate-3.19.2/.github/workflows/setup.yml    2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/setup.yml    2026-03-03 
17:06:26.000000000 +0100
@@ -24,7 +24,7 @@
       uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
       with:
         python-version: 3.14
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
       with:
         enable-cache: false
     - name: Install apt dependencies
@@ -60,7 +60,7 @@
     - uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
       with:
         name: setup
-    - uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
+    - uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
       with:
         path: ./dist/*
 
@@ -73,7 +73,7 @@
       # this permission is mandatory for trusted publishing
       id-token: write
     steps:
-    - uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
+    - uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
       with:
         # unpacks default artifact into dist/
         # if `name: artifact` is omitted, the action will create extra parent dir
@@ -92,7 +92,7 @@
       # this permission is mandatory for creating a release
       contents: write
     steps:
-    - uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
+    - uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
       with:
         # unpacks default artifact into dist/
         # if `name: artifact` is omitted, the action will create extra parent dir
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/test-qemu.yml 
new/translate-3.19.3/.github/workflows/test-qemu.yml
--- old/translate-3.19.2/.github/workflows/test-qemu.yml        2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/test-qemu.yml        2026-03-03 
17:06:26.000000000 +0100
@@ -31,7 +31,7 @@
     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd   # v6.0.2
       with:
         persist-credentials: false
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
       with:
         cache-local-path: /tmp/setup-uv-cache
         cache-suffix: ${{ matrix.arch }}-${{ matrix.distro }}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/test.yml 
new/translate-3.19.3/.github/workflows/test.yml
--- old/translate-3.19.2/.github/workflows/test.yml     2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/test.yml     2026-03-03 
17:06:26.000000000 +0100
@@ -41,7 +41,7 @@
       with:
         allow-prereleases: true
         python-version: ${{ matrix.python-version }}
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
       with:
         cache-suffix: ${{ steps.setup_python.outputs.python-version }}
     - name: Install apt dependencies
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.github/workflows/ty.yml 
new/translate-3.19.3/.github/workflows/ty.yml
--- old/translate-3.19.2/.github/workflows/ty.yml       2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.github/workflows/ty.yml       2026-03-03 
17:06:26.000000000 +0100
@@ -20,7 +20,7 @@
       with:
         python-version: 3.14
 
-    - uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v7.3.0
+    - uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
       with:
         cache-suffix: ${{ steps.setup_python.outputs.python-version }}
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/.pre-commit-config.yaml 
new/translate-3.19.3/.pre-commit-config.yaml
--- old/translate-3.19.2/.pre-commit-config.yaml        2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/.pre-commit-config.yaml        2026-03-03 
17:06:26.000000000 +0100
@@ -13,13 +13,13 @@
   - id: check-hooks-apply
   - id: check-useless-excludes
 - repo: https://github.com/astral-sh/ruff-pre-commit
-  rev: v0.15.2
+  rev: v0.15.4
   hooks:
   - id: ruff-check
     args: [--fix, --exit-non-zero-on-fix]
   - id: ruff-format
 - repo: https://github.com/python-jsonschema/check-jsonschema
-  rev: 0.36.2
+  rev: 0.37.0
   hooks:
   - id: check-github-workflows
 - repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
@@ -44,7 +44,7 @@
   hooks:
   - id: zizmor
 - repo: https://github.com/crate-ci/typos
-  rev: v1.43.5
+  rev: v1.44.0
   hooks:
   - id: typos
     args: []
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/AGENTS.md 
new/translate-3.19.3/AGENTS.md
--- old/translate-3.19.2/AGENTS.md      1970-01-01 01:00:00.000000000 +0100
+++ new/translate-3.19.3/AGENTS.md      2026-03-03 17:06:26.000000000 +0100
@@ -0,0 +1,8 @@
+# Agents guidance for translate-toolkit
+
+# Testing and linting instructions
+
+- Use `pytest` to run the testsuite.
+- Use `prek` to lint code, it utilizes the `pre-commit` framework.
+- Use `pylint` to lint the Python code.
+- Use `ty` to type check the code.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/docs/conf.py 
new/translate-3.19.3/docs/conf.py
--- old/translate-3.19.2/docs/conf.py   2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/docs/conf.py   2026-03-03 17:06:26.000000000 +0100
@@ -23,7 +23,7 @@
 copyright = "Translate Toolkit authors"
 
 # The short X.Y version.
-version = "3.19.2"
+version = "3.19.3"
 
 # The full version, including alpha/beta/rc tags
 release = version
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/docs/releases/3.19.3.rst 
new/translate-3.19.3/docs/releases/3.19.3.rst
--- old/translate-3.19.2/docs/releases/3.19.3.rst       1970-01-01 
01:00:00.000000000 +0100
+++ new/translate-3.19.3/docs/releases/3.19.3.rst       2026-03-03 
17:06:26.000000000 +0100
@@ -0,0 +1,27 @@
+Translate Toolkit 3.19.3
+************************
+
+*Released on 3 March 2026*
+
+This release contains improvements and bug fixes.
+
+Changes
+=======
+
+Formats and Converters
+----------------------
+
+- Markdown
+
+  - Extract code blocks for translation (opt-out possible)
+
+- Apple strings
+
+  - Improve whitespace and comments preservation upon saving
+
+Contributors
+============
+
+This release was made possible by the following contributors:
+
+Michal Čihař, Copilot
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/docs/releases/index.rst 
new/translate-3.19.3/docs/releases/index.rst
--- old/translate-3.19.2/docs/releases/index.rst        2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/docs/releases/index.rst        2026-03-03 
17:06:26.000000000 +0100
@@ -17,6 +17,7 @@
 .. toctree::
    :maxdepth: 1
 
+   3.19.3 <3.19.3>
    3.19.2 <3.19.2>
    3.19.1 <3.19.1>
    3.19.0 <3.19.0>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/pyproject.toml 
new/translate-3.19.3/pyproject.toml
--- old/translate-3.19.2/pyproject.toml 2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/pyproject.toml 2026-03-03 17:06:26.000000000 +0100
@@ -24,7 +24,7 @@
   {include-group = "pylint"}
 ]
 pre-commit = [
-  "pre-commit==4.5.1"
+  "prek==0.3.4"
 ]
 pylint = [
   "pylint==4.0.5"
@@ -37,7 +37,7 @@
 ]
 types = [
   "setuptools>=61.2",
-  "ty==0.0.18",
+  "ty==0.0.20",
   "types-lxml==2026.2.16"
 ]
 
@@ -226,6 +226,7 @@
   "*.yaml",
   "*.yml",
   ".well-known/funding-manifest-urls",
+  "AGENTS.md",
   "CREDITS",
   "docs/*",
   "docs/*/*",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/translate-3.19.2/tests/translate/convert/test_md2po.py 
new/translate-3.19.3/tests/translate/convert/test_md2po.py
--- old/translate-3.19.2/tests/translate/convert/test_md2po.py  2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/tests/translate/convert/test_md2po.py  2026-03-03 
17:06:26.000000000 +0100
@@ -17,6 +17,7 @@
         "-t TEMPLATE, --template=TEMPLATE",
         "--duplicates=DUPLICATESTYLE",
         "--multifile=MULTIFILESTYLE",
+        "--no-code-blocks",
     ]
 
     def test_markdown_file_with_multifile_single(self) -> None:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/translate-3.19.2/tests/translate/convert/test_po2md.py 
new/translate-3.19.3/tests/translate/convert/test_po2md.py
--- old/translate-3.19.2/tests/translate/convert/test_po2md.py  2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/tests/translate/convert/test_po2md.py  2026-03-03 
17:06:26.000000000 +0100
@@ -19,6 +19,7 @@
         "--fuzzy",
         "--nofuzzy",
         "-m MAXLENGTH, --maxlinelength=MAXLENGTH",
+        "--no-code-blocks",
     ]
 
     def test_single_markdown_file_with_single_po(self) -> None:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/translate-3.19.2/tests/translate/storage/test_markdown.py 
new/translate-3.19.3/tests/translate/storage/test_markdown.py
--- old/translate-3.19.2/tests/translate/storage/test_markdown.py       
2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/tests/translate/storage/test_markdown.py       
2026-03-03 17:06:26.000000000 +0100
@@ -217,6 +217,61 @@
         ]
         store = self.parse("".join(input))
         unit_sources = self.get_translation_unit_sources(store)
+        assert unit_sources == ["a simple\n  indented code block"]
+        translated_output = self.get_translated_output(store)
+        assert translated_output == "    (a simple\n      indented code block)\n"
+
+    def test_code_block_not_extracted(self) -> None:
+        input = [
+            "    a simple\n",
+            "      indented code block\n",
+        ]
+        inputfile = BytesIO("".join(input).encode())
+        store = markdown.MarkdownFile(
+            inputfile=inputfile,
+            callback=lambda x: f"({x})",
+            extract_code_blocks=False,
+        )
+        unit_sources = self.get_translation_unit_sources(store)
+        assert unit_sources == []
+
+    def test_fenced_code_block(self) -> None:
+        input = [
+            "```\n",
+            "fenced code block\n",
+            "```\n",
+        ]
+        store = self.parse("".join(input))
+        unit_sources = self.get_translation_unit_sources(store)
+        assert unit_sources == ["fenced code block"]
+        translated_output = self.get_translated_output(store)
+        assert translated_output == "```\n(fenced code block)\n```\n"
+
+    def test_fenced_code_block_with_language(self) -> None:
+        input = [
+            "```python\n",
+            "print('hello')\n",
+            "```\n",
+        ]
+        store = self.parse("".join(input))
+        unit_sources = self.get_translation_unit_sources(store)
+        assert unit_sources == ["print('hello')"]
+        translated_output = self.get_translated_output(store)
+        assert translated_output == "```python\n(print('hello'))\n```\n"
+
+    def test_fenced_code_block_not_extracted(self) -> None:
+        input = [
+            "```python\n",
+            "print('hello')\n",
+            "```\n",
+        ]
+        inputfile = BytesIO("".join(input).encode())
+        store = markdown.MarkdownFile(
+            inputfile=inputfile,
+            callback=lambda x: f"({x})",
+            extract_code_blocks=False,
+        )
+        unit_sources = self.get_translation_unit_sources(store)
         assert unit_sources == []
 
     def test_html_block(self) -> None:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/translate-3.19.2/tests/translate/storage/test_properties.py 
new/translate-3.19.3/tests/translate/storage/test_properties.py
--- old/translate-3.19.2/tests/translate/storage/test_properties.py     
2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/tests/translate/storage/test_properties.py     
2026-03-03 17:06:26.000000000 +0100
@@ -693,15 +693,21 @@
         assert propunit.getnotes() == "Foo\nBar\nBaz"
 
     def test_mac_strings_comments_dropping(self) -> None:
-        """.string generic (and unuseful) comments should be dropped."""
+        """.string generic (and unuseful) comments should be hidden from notes but preserved in round-trip."""
         propsource = """/* No comment provided by engineer. */
-"key" = "value";""".encode("utf-16")
+"key" = "value";
+""".encode("utf-16")
         propfile = self.propparse(propsource, personality="strings")
         assert len(propfile.units) == 1
         propunit = propfile.units[0]
         assert propunit.name == "key"
         assert propunit.source == "value"
+        # Comment should not be exposed via getnotes()
         assert propunit.getnotes() == ""
+        # But it should be preserved in round-trip
+        result = bytes(propfile).decode("utf-16")
+        expected = '/* No comment provided by engineer. */\n"key" = "value";\n'
+        assert result == expected
 
     def test_mac_strings_inline_comments(self) -> None:
         """Test .strings inline comments are parsed correctly."""
@@ -1134,6 +1140,102 @@
         assert propunit.value == "value"
         assert propunit.getnotes() == "long\nnote"
 
+    def test_mac_strings_no_comment_round_trip(self) -> None:
+        """Test that 'No comment provided by engineer.' is preserved in round-trip."""
+        propsource = (
+            '/* No comment provided by engineer. */\n"key1" = "value1";\n'
+            "\n"
+            '/* No comment provided by engineer. */\n"key2" = "value2";\n'
+        ).encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 2
+        # Comments should not be exposed via getnotes()
+        assert propfile.units[0].getnotes() == ""
+        assert propfile.units[1].getnotes() == ""
+        # But the full file should round-trip correctly
+        result = bytes(propfile).decode("utf-16")
+        expected = (
+            '/* No comment provided by engineer. */\n"key1" = "value1";\n'
+            "\n"
+            '/* No comment provided by engineer. */\n"key2" = "value2";\n'
+        )
+        assert result == expected
+
+    def test_mac_strings_blank_lines_preserved(self) -> None:
+        """Test that blank lines between entries are preserved in round-trip."""
+        propsource = (
+            '/* comment1 */\n"key1" = "value1";\n\n/* comment2 */\n"key2" = "value2";\n'
+        ).encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 2
+        result = bytes(propfile).decode("utf-16")
+        expected = (
+            '/* comment1 */\n"key1" = "value1";\n\n/* comment2 */\n"key2" = "value2";\n'
+        )
+        assert result == expected
+
+    def test_mac_strings_single_blank_line_preserved(self) -> None:
+        """Test that a single blank line between entries without comments is preserved."""
+        propsource = '"key1" = "value1";\n\n"key2" = "value2";\n'.encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 2
+        result = bytes(propfile).decode("utf-16")
+        expected = '"key1" = "value1";\n\n"key2" = "value2";\n'
+        assert result == expected
+
+    def test_mac_strings_multiple_blank_lines_preserved(self) -> None:
+        """Test that multiple blank lines between entries are preserved."""
+        propsource = '"key1" = "value1";\n\n\n"key2" = "value2";\n'.encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 2
+        result = bytes(propfile).decode("utf-16")
+        expected = '"key1" = "value1";\n\n\n"key2" = "value2";\n'
+        assert result == expected
+
+    def test_mac_strings_realistic_round_trip(self) -> None:
+        """Test round-trip of a realistic .strings file with mixed comments and blank lines."""
+        propsource = (
+            '/* No comment provided by engineer. */\n"key1" = "value1";\n'
+            "\n"
+            '/* A real comment */\n"key2" = "value2";\n'
+            "\n"
+            '/* No comment provided by engineer. */\n"key3" = "value3";\n'
+        ).encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 3
+        # Only the real comment should be exposed via getnotes()
+        assert propfile.units[0].getnotes() == ""
+        assert propfile.units[1].getnotes() == "A real comment"
+        assert propfile.units[2].getnotes() == ""
+        # Full round-trip should be preserved
+        result = bytes(propfile).decode("utf-16")
+        expected = (
+            '/* No comment provided by engineer. */\n"key1" = "value1";\n'
+            "\n"
+            '/* A real comment */\n"key2" = "value2";\n'
+            "\n"
+            '/* No comment provided by engineer. */\n"key3" = "value3";\n'
+        )
+        assert result == expected
+
+    def test_mac_strings_no_comment_with_whitespace(self) -> None:
+        """Test that 'No comment provided by engineer.' is hidden even with extra whitespace."""
+        propsource = (
+            '  /* No comment provided by engineer. */  \n"key" = "value";\n'
+        ).encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 1
+        # Comment should not be exposed via getnotes() even with whitespace
+        assert propfile.units[0].getnotes() == ""
+
+    def test_mac_strings_multiline_comment_blank_lines(self) -> None:
+        """Test that blank lines inside multi-line comments are preserved in notes."""
+        propsource = '/* Foo\n\nBar */\n"key" = "value";\n'.encode("utf-16")
+        propfile = self.propparse(propsource, personality="strings")
+        assert len(propfile.units) == 1
+        # Blank line inside multi-line comment should be preserved in notes
+        assert propfile.units[0].getnotes() == "Foo\n\nBar"
+
     def test_trailing_newlines(self) -> None:
         """Ensure we can handle Unicode."""
         propsource = """"I am a “key”" = "I am a “value”";\n"""
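The .strings comment behaviour exercised by the tests above — hide the boilerplate engineer comment from translator notes, yet write it back verbatim — can be condensed into a small standalone sketch. The names `GENERIC_COMMENT` and `split_comment` are hypothetical illustrations, not the toolkit's API:

```python
GENERIC_COMMENT = "No comment provided by engineer."

def split_comment(comment: str) -> tuple[str, str]:
    """Toy model of one .strings comment.

    Returns (notes, serialized): the generic engineer comment yields empty
    notes (so getnotes() stays clean), but is still emitted verbatim when
    the file is serialized, preserving the round-trip.
    """
    notes = "" if comment.strip() == GENERIC_COMMENT else comment
    serialized = f"/* {comment} */\n"
    return notes, serialized
```

This mirrors the test expectations: `getnotes()` returns `""` for the generic comment while `bytes(propfile)` reproduces the original file.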
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/translate/__version__.py 
new/translate-3.19.3/translate/__version__.py
--- old/translate-3.19.2/translate/__version__.py       2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/translate/__version__.py       2026-03-03 
17:06:26.000000000 +0100
@@ -18,7 +18,7 @@
 
 """This file contains the version of the Translate Toolkit."""
 
-ver = (3, 19, 2)
+ver = (3, 19, 3)
 """Machine readable version number. Used by tools that need to adjust code
 paths based on a Translate Toolkit release number."""
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/translate/convert/md2po.py 
new/translate-3.19.3/translate/convert/md2po.py
--- old/translate-3.19.2/translate/convert/md2po.py     2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/translate/convert/md2po.py     2026-03-03 
17:06:26.000000000 +0100
@@ -24,6 +24,10 @@
 for examples and usage instructions.
 """
 
+from __future__ import annotations
+
+import functools
+
 from translate.convert import convert
 from translate.storage import markdown, po
 
@@ -40,46 +44,84 @@
         super().__init__(formats, usetemplates=True, usepots=True, description=__doc__)
         self.add_duplicates_option()
         self.add_multifile_option()
+        self.add_option(
+            "",
+            "--no-code-blocks",
+            action="store_false",
+            dest="extract_code_blocks",
+            default=True,
+            help="do not extract code blocks for translation",
+        )
+        self.passthrough.append("extract_code_blocks")
 
     def _extract_translation_units(
         self,
         inputfile,
         outputfile,
         templatefile,
-        duplicatestyle,
-        multifilestyle,
+        duplicatestyle: str,
+        multifilestyle: str,
+        extract_code_blocks: bool = True,
     ) -> int:
         if hasattr(self, "outputstore"):
             if templatefile is None:
-                self._parse_and_extract(inputfile, self.outputstore)
+                self._parse_and_extract(
+                    inputfile, self.outputstore, extract_code_blocks=extract_code_blocks
+                )
             else:
-                self._merge_with_template(inputfile, templatefile, self.outputstore)
+                self._merge_with_template(
+                    inputfile,
+                    templatefile,
+                    self.outputstore,
+                    extract_code_blocks=extract_code_blocks,
+                )
         else:
             store = po.pofile()
             if templatefile is None:
-                self._parse_and_extract(inputfile, store)
+                self._parse_and_extract(
+                    inputfile, store, extract_code_blocks=extract_code_blocks
+                )
             else:
-                self._merge_with_template(inputfile, templatefile, store)
+                self._merge_with_template(
+                    inputfile,
+                    templatefile,
+                    store,
+                    extract_code_blocks=extract_code_blocks,
+                )
             store.removeduplicates(duplicatestyle)
             store.serialize(outputfile)
         return 1
 
     @staticmethod
-    def _parse_and_extract(inputfile, outputstore) -> None:
+    def _parse_and_extract(
+        inputfile, outputstore: po.pofile, *, extract_code_blocks: bool = True
+    ) -> None:
         """Extract translation units from a markdown file and add them to an existing message store (pofile object) without any further processing."""
-        parser = markdown.MarkdownFile(inputfile=inputfile)
+        parser = markdown.MarkdownFile(
+            inputfile=inputfile, extract_code_blocks=extract_code_blocks
+        )
         for tu in parser.units:
             if not tu.isheader():
                 storeunit = outputstore.addsourceunit(tu.source)
                 storeunit.addlocations(tu.getlocations())
 
-    def _merge_with_template(self, inputfile, templatefile, outputstore) -> None:
+    def _merge_with_template(
+        self,
+        inputfile,
+        templatefile,
+        outputstore: po.pofile,
+        *,
+        extract_code_blocks: bool = True,
+    ) -> None:
         """Merge translation from inputfile with source from templatefile using docpath matching."""
+        store_class = functools.partial(
+            markdown.MarkdownFile, extract_code_blocks=extract_code_blocks
+        )
         self.merge_stores_by_docpath(
             inputfile,
             templatefile,
             outputstore,
-            markdown.MarkdownFile,
+            store_class,
             filter_header=True,
         )
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/translate-3.19.2/translate/convert/po2md.py 
new/translate-3.19.3/translate/convert/po2md.py
--- old/translate-3.19.2/translate/convert/po2md.py     2026-02-25 
10:00:16.000000000 +0100
+++ new/translate-3.19.3/translate/convert/po2md.py     2026-03-03 
17:06:26.000000000 +0100
@@ -23,6 +23,8 @@
 for examples and usage instructions.
 """
 
+from __future__ import annotations
+
 import os
 import sys
 
@@ -34,12 +36,20 @@
 
 
 class MarkdownTranslator:
-    def __init__(self, inputstore, includefuzzy, outputthreshold, maxlength) -> None:
+    def __init__(
+        self,
+        inputstore: po.pofile,
+        includefuzzy: bool,
+        outputthreshold: int | None,
+        maxlength: int,
+        extract_code_blocks: bool = True,
+    ) -> None:
         self.inputstore = inputstore
         self.inputstore.require_index()
         self.includefuzzy = includefuzzy
         self.outputthreshold = outputthreshold
         self.maxlength = maxlength
+        self.extract_code_blocks = extract_code_blocks
 
     def translate(self, templatefile, outputfile) -> int:
         if not convert.should_output_store(self.inputstore, self.outputthreshold):
@@ -49,11 +59,12 @@
             inputfile=templatefile,
             callback=self._lookup,
             max_line_length=self.maxlength if self.maxlength > 0 else None,
+            extract_code_blocks=self.extract_code_blocks,
         )
         outputfile.write(outputstore.filesrc.encode("utf-8"))
         return 1
 
-    def _lookup(self, string):
+    def _lookup(self, string: str) -> str:
         unit = self.inputstore.sourceindex.get(string, None)
         if unit is None:
             return string
@@ -83,6 +94,15 @@
             help="reflow (word wrap) the output to the given maximum line length. set to 0 to disable",
         )
         self.passthrough.append("maxlength")
+        self.add_option(
+            "",
+            "--no-code-blocks",
+            action="store_false",
+            dest="extract_code_blocks",
+            default=True,
+            help="do not extract code blocks for translation",
+        )
+        self.passthrough.append("extract_code_blocks")
         self.add_threshold_option()
         self.add_fuzzy_option()
 
@@ -91,13 +111,18 @@
         inputfile,
         outputfile,
         templatefile,
-        includefuzzy,
-        outputthreshold,
-        maxlength,
+        includefuzzy: bool,
+        outputthreshold: int | None,
+        maxlength: int,
+        extract_code_blocks: bool = True,
     ):
         inputstore = po.pofile(inputfile)
         translator = MarkdownTranslator(
-            inputstore, includefuzzy, outputthreshold, maxlength
+            inputstore,
+            includefuzzy,
+            outputthreshold,
+            maxlength,
+            extract_code_blocks=extract_code_blocks,
         )
         return translator.translate(templatefile, outputfile)
 
@@ -149,12 +174,17 @@
         inputfile,
         outputfile,
         templatefile,
-        includefuzzy,
-        outputthreshold,
-        maxlength,
+        includefuzzy: bool,
+        outputthreshold: int | None,
+        maxlength: int,
+        extract_code_blocks: bool = True,
     ):
         translator = MarkdownTranslator(
-            self.inputstore, includefuzzy, outputthreshold, maxlength
+            self.inputstore,
+            includefuzzy,
+            outputthreshold,
+            maxlength,
+            extract_code_blocks=extract_code_blocks,
         )
         return translator.translate(templatefile, outputfile)
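
The `_lookup` callback typed above returns the source string when no matching unit exists; the rest of its body is truncated in this hunk. A hedged sketch of that lookup shape, assuming the usual PO fallback semantics (untranslated or excluded fuzzy entries fall back to the source; `Unit` and `Translator` are illustrative stand-ins, not the toolkit classes):

```python
class Unit:
    def __init__(self, source, target, fuzzy=False):
        self.source, self.target, self.fuzzy = source, target, fuzzy

class Translator:
    def __init__(self, index, includefuzzy=False):
        self.index = index          # source string -> Unit
        self.includefuzzy = includefuzzy

    def lookup(self, string):
        # Fall back to the source text when no unit matches, when the
        # unit has no translation, or when fuzzy entries are excluded.
        unit = self.index.get(string)
        if unit is None or not unit.target:
            return string
        if unit.fuzzy and not self.includefuzzy:
            return string
        return unit.target

t = Translator({"Hello": Unit("Hello", "Hallo")})
print(t.lookup("Hello"))    # Hallo
print(t.lookup("Missing"))  # Missing
```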
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/translate-3.19.2/translate/convert/prop2mozfunny.py new/translate-3.19.3/translate/convert/prop2mozfunny.py
--- old/translate-3.19.2/translate/convert/prop2mozfunny.py     2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/translate/convert/prop2mozfunny.py     2026-03-03 17:06:26.000000000 +0100
@@ -40,12 +40,11 @@
         if unit.isblank():
             pendingblanks.append("\n")
         else:
-            definition = "#define {} {}\n".format(
+            yield from pendingblanks
+            yield "#define {} {}\n".format(
                 unit.name,
                 unit.value.replace("\n", "\\n"),
             )
-            yield from pendingblanks
-            yield definition
 
 
 def prop2it(pf):
@@ -64,8 +63,7 @@
         if unit.isblank():
             yield ""
         else:
-            definition = f"{unit.name}={unit.value}\n"
-            yield definition
+            yield f"{unit.name}={unit.value}\n"
 
 
 def prop2funny(src, itencoding="cp1252"):
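
The reordering above buffers blank lines and flushes them only immediately before a real definition, so blank lines interior to the file survive while trailing blanks are dropped. A simplified sketch of that generator pattern (`emit_definitions` is an illustrative stand-in, not the converter function):

```python
def emit_definitions(units):
    # Buffer blank lines; flush them only when a real definition
    # follows, so trailing blanks at end of input are dropped.
    pending = []
    for name, value in units:
        if name is None:  # blank-line marker
            pending.append("\n")
        else:
            yield from pending
            pending = []
            yield f"#define {name} {value}\n"

lines = list(emit_definitions([("A", "1"), (None, None), ("B", "2"), (None, None)]))
print(lines)  # ['#define A 1\n', '\n', '#define B 2\n']
```

Note the trailing blank marker produces no output: it stays buffered when the input ends.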
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/translate-3.19.2/translate/storage/markdown.py new/translate-3.19.3/translate/storage/markdown.py
--- old/translate-3.19.2/translate/storage/markdown.py  2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/translate/storage/markdown.py  2026-03-03 17:06:26.000000000 +0100
@@ -79,7 +79,13 @@
 class MarkdownFile(base.TranslationStore[MarkdownUnit]):
     UnitClass = MarkdownUnit
 
-    def __init__(self, inputfile=None, callback=None, max_line_length=None) -> None:
+    def __init__(
+        self,
+        inputfile=None,
+        callback=None,
+        max_line_length=None,
+        extract_code_blocks=True,
+    ) -> None:
         """
         Construct a new object instance.
 
@@ -89,11 +95,14 @@
           a no-op.
         :param max_line_length: if specified, the document is word wrapped to the
           given line length when rendered.
+        :param extract_code_blocks: if True (default), code blocks are extracted
+          for translation. If False, code blocks are left as-is.
         """
         base.TranslationStore.__init__(self)
         self.filename = getattr(inputfile, "name", None)
         self.callback = callback or self._dummy_callback
         self.max_line_length = max_line_length
+        self.extract_code_blocks = extract_code_blocks
         self.filesrc = ""
         if inputfile is not None:
             md_src = inputfile.read()
@@ -133,6 +142,7 @@
             self._translate_callback,
             block_token.Table,
             max_line_length=self.max_line_length,
+            extract_code_blocks=self.extract_code_blocks,
         ) as renderer:
             document = block_token.Document(lines)
             self.filesrc = front_matter + renderer.render(document)
@@ -163,12 +173,14 @@
         translate_callback: Callable[[str, list[str], str], str],
         *extras,
         max_line_length: int | None = None,
+        extract_code_blocks: bool = True,
     ) -> None:
         super().__init__(*extras, max_line_length=max_line_length)  # ty:ignore[invalid-argument-type]
         self.translate_callback = translate_callback
         self.bypass = False
         self.path = []
         self.ignore_translation = False
+        self.extract_code_blocks = extract_code_blocks
         # Docpath tracking: heading hierarchy and sibling counts
         # _heading_stack: list of (level, heading_index) for current heading nesting
         self._heading_stack: list[tuple[int, int]] = []
@@ -428,6 +440,46 @@
         # Return the raw HTML block content
         return super().render_html_block(token, max_line_length=max_line_length)
 
+    def render_block_code(
+        self, token: block_token.BlockCode, max_line_length: int
+    ) -> Iterable[str]:
+        if not self.extract_code_blocks or self.ignore_translation:
+            return super().render_block_code(token, max_line_length=max_line_length)
+
+        self.path.append(f":{token.line_number}")  # ty:ignore[unresolved-attribute]
+        self._current_docpath = self._build_docpath("code")
+        code_content = token.content[:-1]  # strip trailing \n
+        translated = self.translate_callback(
+            code_content, self.path, self._current_docpath
+        )
+        lines = translated.split("\n")
+        self.path.pop()
+        return self.prefix_lines(lines, "    ")
+
+    def render_fenced_code_block(  # ty:ignore[invalid-method-override]
+        self,
+        token: block_token.CodeFence,
+        max_line_length: int,
+    ) -> Iterable[str]:
+        if not self.extract_code_blocks or self.ignore_translation:
+            return super().render_fenced_code_block(
+                token,  # ty:ignore[invalid-argument-type]
+                max_line_length=max_line_length,
+            )
+
+        self.path.append(f":{token.line_number}")  # ty:ignore[unresolved-attribute]
+        self._current_docpath = self._build_docpath("code")
+        code_content = token.content[:-1]  # strip trailing \n
+        translated = self.translate_callback(
+            code_content, self.path, self._current_docpath
+        )
+        indentation = " " * token.indentation
+        result = [indentation + token.delimiter + token.info_string]
+        result.extend(self.prefix_lines(translated.split("\n"), indentation))
+        result.append(indentation + token.delimiter)
+        self.path.pop()
+        return result
+
     def render_list_item(
         self, token: block_token.ListItem, max_line_length: int
     ) -> Iterable[str]:
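
The new `render_fenced_code_block` above rebuilds the fence around the (possibly translated) content, preserving indentation, the fence delimiter and the info string. A standalone sketch of that reconstruction — `render_fence` is hypothetical, and `~` is used as the fence character here purely for illustration:

```python
def render_fence(content, info="", fence_char="~", indent=0):
    # Rebuild a fenced code block around translated content, keeping
    # indentation, the fence delimiter and the info string.
    pad = " " * indent
    fence = fence_char * 3
    lines = [pad + fence + info]
    lines.extend(pad + line for line in content.rstrip("\n").split("\n"))
    lines.append(pad + fence)
    return lines

print(render_fence("x = 1\n", info="python"))
# ['~~~python', 'x = 1', '~~~']
```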
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/translate-3.19.2/translate/storage/properties.py new/translate-3.19.3/translate/storage/properties.py
--- old/translate-3.19.2/translate/storage/properties.py        2026-02-25 10:00:16.000000000 +0100
+++ new/translate-3.19.3/translate/storage/properties.py        2026-03-03 17:06:26.000000000 +0100
@@ -267,6 +267,8 @@
     key_wrap_char: str = ""
     value_wrap_char: str = ""
     drop_comments: list[str] = []
+    hidden_comments: list[str] = []
+    preserve_blank_lines: bool = False
     has_plurals: bool = False
 
     @staticmethod
@@ -544,7 +546,9 @@
     value_wrap_char = '"'
     out_ending = ";"
     out_delimiter_wrappers = " "
-    drop_comments = ["/* No comment provided by engineer. */"]
+    drop_comments = []
+    hidden_comments = ["/* No comment provided by engineer. */"]
+    preserve_blank_lines = True
     encode_trans = str.maketrans(
         {
             "\\": "\\\\",
@@ -1008,7 +1012,7 @@
     def getoutput(self):
         """Convert the element back into formatted lines for a .properties file."""
         notes = "\n".join(self.comments)
-        if notes:
+        if notes or (self.comments and self.personality.preserve_blank_lines):
             notes = f"{notes}\n"
         if self.isblank():
             return notes or "\n"
@@ -1045,9 +1049,15 @@
 
     def getnotes(self, origin=None):
         if origin in {"programmer", "developer", "source code", None}:
+            hidden = {c.strip() for c in self.personality.hidden_comments}
+            skip_blank = self.personality.preserve_blank_lines
             output = []
             inmultilinecomment = False
             for line in self.comments:
+                if line.strip() in hidden or (
+                    skip_blank and not line and not inmultilinecomment
+                ):
+                    continue
                 if (
                     not inmultilinecomment
                     and (parsed := get_comment_one_line(line)) is not None
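
The `getnotes` change above filters hidden comments (and, for the Apple `.strings` personality, blank lines) out of translator-visible notes, while the comments themselves remain on the unit so they round-trip unchanged. A simplified sketch of that filtering — `visible_notes` is an illustrative stand-in, not the toolkit method:

```python
def visible_notes(comments, hidden, preserve_blank_lines=True):
    # Hide boilerplate comments from translator notes; the caller
    # keeps the full comment list for round-tripping.
    hidden_set = {c.strip() for c in hidden}
    out = []
    for line in comments:
        if line.strip() in hidden_set:
            continue
        if preserve_blank_lines and not line:
            continue
        out.append(line)
    return out

notes = visible_notes(
    ["/* No comment provided by engineer. */", "", "/* Button title */"],
    hidden=["/* No comment provided by engineer. */"],
)
print(notes)  # ['/* Button title */']
```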
