Xqt has submitted this change. (https://gerrit.wikimedia.org/r/c/pywikibot/core/+/1212647?usp=email)
Change subject: cleanup: Remove deprecated GeneratorsMixin.alllinks, keep RST docs
......................................................................
cleanup: Remove deprecated GeneratorsMixin.alllinks, keep RST docs
- The GeneratorsMixin.alllinks method was deprecated in Pywikibot 10.7
  and removed in Pywikibot 11 because it was dysfunctional; a replacement
  sketch follows below
- Added a manual RST entry under GeneratorsMixin to preserve the historical
  API documentation, given the short deprecation period
Bug: T407708
Change-Id: Ib1c01744758148c1f09e42b3c2a2fbf0ebb2f051
---
M ROADMAP.rst
M docs/api_ref/pywikibot.site.rst
M pywikibot/site/_generators.py
3 files changed, 54 insertions(+), 108 deletions(-)
Approvals:
Xqt: Verified; Looks good to me, approved
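
For callers migrating off the removed wrapper, here is a minimal sketch of
driving the MediaWiki ``list=alllinks`` module directly, mirroring the removed
implementation shown in the diff below. It relies on the private
``_generator`` helper and ``api.ListGenerator``, so treat it as illustrative
rather than a stable API; the target wiki is a hypothetical choice:

.. code-block:: python

    import pywikibot
    from pywikibot.data import api

    site = pywikibot.Site('en', 'wikipedia')  # hypothetical target wiki

    # Drive the MediaWiki list=alllinks module the way the removed
    # wrapper did; _generator is private, so this is a sketch only.
    algen = site._generator(api.ListGenerator, type_arg='alllinks',
                            namespaces=0, total=10)
    algen.request['alprop'] = 'title|ids'  # also yield the source page id

    for link in algen:
        page = pywikibot.Page(site, link['title'], link['ns'])
        print(page.title(), link.get('fromid'))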
diff --git a/ROADMAP.rst b/ROADMAP.rst
index b247c7d..2fbbe04 100644
--- a/ROADMAP.rst
+++ b/ROADMAP.rst
@@ -3,6 +3,7 @@
**Improvements**
+* Provide a security policy with Pywikibot (:phab:`T410753`)
* Show a friendly install message with :mod:`pwb<pywikibot.scripts.wrapper>`
  wrapper when mandatory packages are missing (:phab:`T409662`)
* Update `tools._unidata.__category_cf` dict for
:func:`tools.chars.contains_invisible` and
@@ -17,6 +18,9 @@
**Code cleanups**
+* Dysfunctional :meth:`APISite.alllinks()
+ <pywikibot.site._generators.GeneratorsMixin.alllinks>` was removed.
+ (:phab:`T359427`, :phab:`T407708`)
* The inheritance of the :exc:`exceptions.NoSiteLinkError` exception from
:exc:`exceptions.NoPageError` was removed
* The *dropdelay* and *releasepid* attributes of the
  :class:`throttle.Throttle` class were
@@ -84,14 +88,6 @@
removed in the third subsequent major release, remaining available for the
two releases in between.
-Pending removal in Pywikibot 11
--------------------------------
-
-* 10.7.0: Dysfunctional :meth:`APISite.alllinks()
- <pywikibot.site._generators.GeneratorsMixin.alllinks>` will be removed.
- (:phab:`T359427`, :phab:`T407708`)
-
-
Pending removal in Pywikibot 12
-------------------------------
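
As background for the deprecation policy above, the removed method carried
pywikibot's ``deprecated`` decorator, as the diff to
pywikibot/site/_generators.py further below shows. A minimal sketch of that
pattern, with an illustrative stub class and body standing in for the real
ones:

.. code-block:: python

    from pywikibot.tools import deprecated

    class GeneratorsMixin:
        """Illustrative stub; the real class lives in pywikibot.site."""

        @deprecated(since='10.7.0')
        def alllinks(self, *args, **kwargs):
            # Each call warns that the method is deprecated since the
            # named release, before running the (stubbed) body.
            ...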
diff --git a/docs/api_ref/pywikibot.site.rst b/docs/api_ref/pywikibot.site.rst
index 220a568..6fe32e4 100644
--- a/docs/api_ref/pywikibot.site.rst
+++ b/docs/api_ref/pywikibot.site.rst
@@ -86,6 +86,55 @@
.. automodule:: pywikibot.site._generators
:synopsis: Objects representing API generators to MediaWiki site
+.. currentmodule:: pywikibot.site._generators
+
+.. method:: GeneratorsMixin.alllinks(start='', prefix='', namespace=0, unique=False, fromids=False, total=None)
+
+ Iterate all links to pages (which need not exist) in one namespace.
+
+ .. note::
+ In practice, links that were found on pages that have
+ been deleted may not have been removed from the links table,
+ so this method can return false positives.
+
+ .. caution::
+ The *unique* parameter is no longer supported by
+ MediaWiki 1.43 or higher. Pywikibot uses
+ :func:`tools.itertools.filter_unique` in that case, which
+ might be memory intensive. Use it with care.
+
+ .. important::
+ Using a *namespace* other than ``0``
+ can take a long time on the Wikidata site.
+
+ .. deprecated:: 10.7
+ This method is dysfunctional and should no longer be used. It
+ will be removed in Pywikibot 11.
+
+ .. versionremoved:: 11.0
+ This method was dysfunctional and was removed; see the following
+ tickets for details: :phab:`T359427`, :phab:`T364617` and :phab:`T407708`
+
+ .. seealso::
+ - :api:`Alllinks`
+ - :meth:`pagebacklinks`
+ - :meth:`pagelinks`
+
+ :param start: Start at this title (page need not exist).
+ :param prefix: Only yield pages starting with this string.
+ :param namespace: Iterate pages from this (single) namespace
+ :param unique: If True, only iterate each link title once
+ (default: False)
+ :param fromids: if True, include the pageid of the page
+ containing each link (default: False) as the '_fromid'
+ attribute of the Page; cannot be combined with *unique*
+ :param total: Limit the total number of items
+ :raises KeyError: the *namespace* identifier was not resolved
+ :raises TypeError: the *namespace* identifier has an
+ inappropriate type such as bool, or an iterable with more
+ than one namespace
+ :rtype: Generator[pywikibot.Page]
+
:mod:`DataSite<pywikibot.site.\_datasite>` --- API Interface for Wikibase
=========================================================================
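
For readers of the preserved entry above, a call on Pywikibot 10.x looked
roughly like the following (hypothetical site; on Pywikibot 11 and later the
attribute no longer exists and this raises AttributeError):

.. code-block:: python

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')  # hypothetical site

    # Historical call shape per the preserved signature; fromids
    # attaches the source page id as the _fromid attribute.
    for page in site.alllinks(prefix='Pytho', namespace=0,
                              fromids=True, total=5):
        print(page.title(), getattr(page, '_fromid', None))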
diff --git a/pywikibot/site/_generators.py b/pywikibot/site/_generators.py
index e62ff66..8b15ad1 100644
--- a/pywikibot/site/_generators.py
+++ b/pywikibot/site/_generators.py
@@ -27,12 +27,7 @@
)
from pywikibot.site._decorators import need_right
from pywikibot.site._namespace import NamespaceArgType
-from pywikibot.tools import (
- deprecate_arg,
- deprecated,
- deprecated_signature,
- is_ip_address,
-)
+from pywikibot.tools import deprecate_arg, deprecated_signature, is_ip_address
from pywikibot.tools.itertools import filter_unique
@@ -1013,100 +1008,6 @@
return apgen
- @deprecated(since='10.7.0')
- def alllinks(
- self,
- start: str = '',
- prefix: str = '',
- namespace: SingleNamespaceType = 0,
- unique: bool = False,
- fromids: bool = False,
- total: int | None = None,
- ) -> Generator[pywikibot.Page]:
- """Iterate all links to pages (which need not exist) in one namespace.
-
- .. note:: In practice, links that were found on pages that have
- been deleted may not have been removed from the links table,
- so this method can return false positives.
-
- .. caution:: *unique* parameter is no longer supported by
- MediaWiki 1.43 or higher. Pywikibot uses
- :func:`tools.itertools.filter_unique` in that case which
- might be memory intensive. Use it with care.
-
- .. important:: Using *namespace* option different from ``0``
- needs a lot of time on Wikidata site. You have to increase
- the **read** timeout part of ``socket_timeout`` in
- :ref:`Http Settings` in your ``user-config.py`` file. Or
- increase it partially within your code like:
-
- .. code-block:: python
-
- from pywikibot import config
- save_timeout = config.socket_timeout # save the timeout config
- config.socket_timeout = save_timeout[0], 60
- ... # your code here
- config.socket_timeout = save_timeout # restore timeout config
-
- The minimum read timeout value should be 60 seconds in that
- case.
-
- .. deprecated:: 10.7
- This method is dysfunctional and should no longer be used. It
- will probably be removed in Pywikibot 11.
-
- .. seealso::
- - :api:`Alllinks`
- - :meth:`pagebacklinks`
- - :meth:`pagelinks`
-
- :param start: Start at this title (page need not exist).
- :param prefix: Only yield pages starting with this string.
- :param namespace: Iterate pages from this (single) namespace
- :param unique: If True, only iterate each link title once
- (default: False)
- :param fromids: if True, include the pageid of the page
- containing each link (default: False) as the '_fromid'
- attribute of the Page; cannot be combined with *unique*
- :raises KeyError: the *namespace* identifier was not resolved
- :raises TypeError: the *namespace* identifier has an
- inappropriate type such as bool, or an iterable with more
- than one namespace
- """
- # no cover: start
- if unique and fromids:
- raise Error('alllinks: unique and fromids cannot both be True.')
- algen = self._generator(api.ListGenerator, type_arg='alllinks',
- namespaces=namespace, total=total)
- if fromids:
- algen.request['alprop'] = 'title|ids'
-
- # circumvent problems with unique and prefix due to T359425 and T359427
- if self.mw_version < '1.43':
- if prefix:
- algen.request['alprefix'] = prefix
- prefix = '' # don't break the loop later
- if start:
- algen.request['alfrom'] = start
- algen.request['alunique'] = unique
- else:
- if prefix:
- algen.request['alfrom'] = prefix
- elif start:
- algen.request['alfrom'] = start
- if unique:
- algen = filter_unique(
- algen, key=lambda link: (link['title'], link['ns']))
-
- for link in algen:
- p = pywikibot.Page(self, link['title'], link['ns'])
- if prefix and p.title() > prefix: # T359425, T359427
- break
- if fromids:
- p._fromid = link['fromid'] # type: ignore[attr-defined]
- yield p
- # no cover: stop
-
def allcategories(
self,
start: str = '!',
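
The *unique* caution in the docs refers to the client-side fallback visible
in the removed code above: ``tools.itertools.filter_unique`` keyed on
``(title, ns)``. A small self-contained sketch of that call; it keeps every
seen key in memory, which is why the docs warn about memory use:

.. code-block:: python

    from pywikibot.tools.itertools import filter_unique

    # Deduplicate link records by (title, ns), as the removed
    # alllinks code did; seen keys accumulate in a set, so memory
    # grows with the number of distinct links.
    links = [{'title': 'Foo', 'ns': 0},
             {'title': 'Foo', 'ns': 0},
             {'title': 'Bar', 'ns': 0}]
    unique = filter_unique(links,
                           key=lambda link: (link['title'], link['ns']))
    print([link['title'] for link in unique])  # ['Foo', 'Bar']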
--
Gerrit-MessageType: merged
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Change-Id: Ib1c01744758148c1f09e42b3c2a2fbf0ebb2f051
Gerrit-Change-Number: 1212647
Gerrit-PatchSet: 1
Gerrit-Owner: Xqt <[email protected]>
Gerrit-Reviewer: Xqt <[email protected]>
Gerrit-Reviewer: jenkins-bot