Hello community,
here is the log from the commit of package python-sphinxcontrib-httpdomain for
openSUSE:Factory checked in at 2014-09-17 17:26:59
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-sphinxcontrib-httpdomain (Old)
and /work/SRC/openSUSE:Factory/.python-sphinxcontrib-httpdomain.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-sphinxcontrib-httpdomain"
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-sphinxcontrib-httpdomain/python-sphinxcontrib-httpdomain.changes 2013-10-21 20:02:16.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.python-sphinxcontrib-httpdomain.new/python-sphinxcontrib-httpdomain.changes 2014-09-17 17:27:29.000000000 +0200
@@ -1,0 +2,33 @@
+Tue Sep 16 06:36:36 UTC 2014 - [email protected]
+
+- update to version 1.3.0:
+ * jsonparameter/jsonparam/json became deprecated and split into
+ reqjsonobj/reqjson/<jsonobj/<json and reqjsonarr/<jsonarr.
+ [issue #55, pull request #72 by Alexander Shorin]
+ * Support synopsis (short description in HTTP index), deprecation
+ and noindex options for resources.
+ [issue #55, pull request #72 by Alexander Shorin]
+ * Stabilize order of index items.
+ [issue #55, pull request #72 by Alexander Shorin]
+ * Added :rst:directive:`http:any` directive and
+ http:any role for ANY method.
+ [issue #55, pull request #72 by Alexander Shorin]
+ * Added :rst:directive:`http:copy` directive and http:copy role
+ for COPY method.
+ [issue #55, pull request #72 by Alexander Shorin]
+ * Added http:header role that also creates reference to the
+ related specification.
+ [issue #55, pull request #72 by Alexander Shorin]
+  * http:statuscode role now provides references to
+    specification sections.
+ [issue #55, pull request #72 by Alexander Shorin]
+ * Fixed Python 3 incompatibility of autohttp.tornado.
+ [pull request #61 by Dave Shawley]
+
+-------------------------------------------------------------------
+Tue Jun 24 08:18:04 UTC 2014 - [email protected]
+
+- update to 1.2.1:
+ * Six support
+
+-------------------------------------------------------------------
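The 1.2.1 and 1.3.0 updates both hinge on the new python-six dependency. As a stdlib-only sketch (not code from this package) of what the six helpers used throughout the diff below resolve to on each major Python version:

```python
import sys
import io

# Roughly what six.StringIO / six.text_type / six.string_types
# boil down to; the diff replaces cStringIO, unicode and basestring
# with these names so one code path runs on Python 2 and 3.
PY2 = sys.version_info[0] == 2

if PY2:
    text_type = unicode              # noqa: F821 -- Py2 only
    string_types = (str, unicode)    # noqa: F821 -- Py2 only
    from cStringIO import StringIO
else:
    text_type = str
    string_types = (str,)
    StringIO = io.StringIO

def as_lines(content):
    # Mirrors the content handling in autohttp/common.py below:
    # accept either a single string or an iterable of lines.
    if isinstance(content, string_types):
        content = content.splitlines()
    return list(content)
```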
Old:
----
sphinxcontrib-httpdomain-1.2.0.tar.gz
New:
----
sphinxcontrib-httpdomain-1.3.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-sphinxcontrib-httpdomain.spec ++++++
--- /var/tmp/diff_new_pack.wqP2uM/_old 2014-09-17 17:27:30.000000000 +0200
+++ /var/tmp/diff_new_pack.wqP2uM/_new 2014-09-17 17:27:30.000000000 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-sphinxcontrib-httpdomain
#
-# Copyright (c) 2013 SUSE LINUX Products GmbH, Nuernberg, Germany.
+# Copyright (c) 2014 SUSE LINUX Products GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,7 +17,7 @@
Name: python-sphinxcontrib-httpdomain
-Version: 1.2.0
+Version: 1.3.0
Release: 0
Summary: Sphinx domain for HTTP APIs
License: BSD-2-Clause
@@ -27,6 +27,7 @@
BuildRequires: python-devel
BuildRequires: python-setuptools
Requires: python-Sphinx
+Requires: python-six
BuildRoot: %{_tmppath}/%{name}-%{version}-build
%if 0%{?suse_version} && 0%{?suse_version} <= 1110
%{!?python_sitelib: %global python_sitelib %(python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()")}
++++++ sphinxcontrib-httpdomain-1.2.0.tar.gz -> sphinxcontrib-httpdomain-1.3.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/PKG-INFO new/sphinxcontrib-httpdomain-1.3.0/PKG-INFO
--- old/sphinxcontrib-httpdomain-1.2.0/PKG-INFO 2013-10-19 09:45:53.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/PKG-INFO 2014-07-30 18:41:23.000000000 +0200
@@ -1,6 +1,6 @@
-Metadata-Version: 1.0
+Metadata-Version: 1.1
Name: sphinxcontrib-httpdomain
-Version: 1.2.0
+Version: 1.3.0
Summary: Sphinx domain for HTTP APIs
Home-page: http://bitbucket.org/birkenfeld/sphinx-contrib
Author: Hong Minhee
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/setup.cfg new/sphinxcontrib-httpdomain-1.3.0/setup.cfg
--- old/sphinxcontrib-httpdomain-1.2.0/setup.cfg 2013-10-19 09:45:53.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/setup.cfg 2014-07-30 18:41:23.000000000 +0200
@@ -1,12 +1,15 @@
[aliases]
upload_docs = build_sphinx upload_docs
+[bdist_wheel]
+universal = 1
+
[upload_docs]
repository = https://pypi.python.org/pypi
upload_dir = build/sphinx/html
[egg_info]
+tag_svn_revision = 0
tag_build =
tag_date = 0
-tag_svn_revision = 0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/setup.py new/sphinxcontrib-httpdomain-1.3.0/setup.py
--- old/sphinxcontrib-httpdomain-1.2.0/setup.py 2013-08-07 20:24:15.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/setup.py 2014-07-30 18:33:26.000000000 +0200
@@ -11,11 +11,14 @@
http://pythonhosted.org/sphinxcontrib-httpdomain/
'''
-requires = ['Sphinx>=1.0']
+requires = [
+ 'Sphinx >= 1.0',
+ 'six'
+]
setup(
name='sphinxcontrib-httpdomain',
- version='1.2.0',
+ version='1.3.0',
url='http://bitbucket.org/birkenfeld/sphinx-contrib',
download_url='http://pypi.python.org/pypi/sphinxcontrib-httpdomain',
license='BSD',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/bottle.py new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/bottle.py
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/bottle.py 2013-08-07 20:24:15.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/bottle.py 2014-07-30 18:33:17.000000000 +0200
@@ -11,10 +11,7 @@
"""
import re
-try:
- import cStringIO as StringIO
-except ImportError:
- import StringIO
+import six
from docutils import nodes
from docutils.parsers.rst import directives
@@ -31,12 +28,17 @@
def translate_bottle_rule(app, rule):
- buf = StringIO.StringIO()
- for name, filter, conf in app.router.parse_rule(rule):
+ buf = six.StringIO()
+ if hasattr(app.router, "parse_rule"):
+ iterator = app.router.parse_rule(rule) # bottle 0.11
+ else:
+ iterator = app.router._itertokens(rule) # bottle 0.12
+ for name, filter, conf in iterator:
if filter:
buf.write('(')
buf.write(name)
- if filter != app.router.default_filter or conf:
+ if (filter != app.router.default_filter and filter != 'default')\
+ or conf:
buf.write(':')
buf.write(filter)
if conf:
@@ -49,12 +51,9 @@
def get_routes(app):
- for rule, methods in app.router.rules.iteritems():
- for method, target in methods.iteritems():
- if method in ('OPTIONS', 'HEAD'):
- continue
- path = translate_bottle_rule(app, rule)
- yield method, path, target
+ for route in app.routes:
+ path = translate_bottle_rule(app, route.rule)
+ yield route.method, path, route
class AutobottleDirective(Directive):
@@ -89,7 +88,7 @@
continue
view = target.callback
docstring = view.__doc__ or ''
- if not isinstance(docstring, unicode):
+ if not isinstance(docstring, six.text_type):
analyzer = ModuleAnalyzer.for_module(view.__module__)
docstring = force_decode(docstring, analyzer.encoding)
if not docstring and 'include-empty-docstring' not in self.options:
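The bottle.py hunk above keeps one code path across bottle versions by probing for the method name that changed between releases (`parse_rule` in 0.11 vs. `_itertokens` in 0.12). A self-contained sketch of that feature-detection pattern, using stand-in router classes rather than bottle's real ones:

```python
class OldRouter:
    # Stand-in for a bottle 0.11 router (not bottle's real class).
    def parse_rule(self, rule):
        yield rule, None, None

class NewRouter:
    # Stand-in for a bottle 0.12 router, where the method was renamed.
    def _itertokens(self, rule):
        yield rule, None, None

def iter_tokens(router, rule):
    # hasattr() probing picks the right API at runtime, so the caller
    # never needs to know which bottle version is installed.
    if hasattr(router, 'parse_rule'):
        return router.parse_rule(rule)      # bottle 0.11
    return router._itertokens(rule)         # bottle 0.12
```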
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/common.py new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/common.py
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/common.py 2013-08-07 20:24:08.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/common.py 2014-07-30 18:33:17.000000000 +0200
@@ -8,15 +8,15 @@
:license: BSD, see LICENSE for details.
"""
-
-import __builtin__
-
+import six
+from six.moves import builtins
+from six.moves import reduce
def import_object(import_name):
module_name, expr = import_name.split(':', 1)
mod = __import__(module_name)
mod = reduce(getattr, module_name.split('.')[1:], mod)
- globals = __builtin__
+ globals = builtins
if not isinstance(globals, dict):
globals = globals.__dict__
return eval(expr, globals, mod.__dict__)
@@ -24,7 +24,7 @@
def http_directive(method, path, content):
method = method.lower().strip()
- if isinstance(content, basestring):
+ if isinstance(content, six.string_types):
content = content.splitlines()
yield ''
yield '.. http:{method}:: {path}'.format(**locals())
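For context, `http_directive` is the generator that turns a discovered route into reStructuredText lines for the httpdomain directives. A sketch reconstructed from the hunk above; the trailing content-indentation lines are not shown in the diff, so the three-space indent here is an assumption, and `str` stands in for `six.string_types`:

```python
def http_directive(method, path, content):
    # Emit reST lines for one route, e.g. a ".. http:get:: /path" block.
    method = method.lower().strip()
    if isinstance(content, str):   # six.string_types in the original
        content = content.splitlines()
    yield ''
    yield '.. http:{method}:: {path}'.format(**locals())
    yield ''
    for line in content:
        yield '   ' + line         # assumed indent, not shown in the diff
    yield ''
```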
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/flask.py new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/flask.py
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/flask.py 2013-08-07 20:24:15.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/flask.py 2014-07-30 18:33:17.000000000 +0200
@@ -11,10 +11,7 @@
"""
import re
-try:
- import cStringIO as StringIO
-except ImportError:
- import StringIO
+import six
from docutils import nodes
from docutils.parsers.rst import directives
@@ -32,7 +29,7 @@
def translate_werkzeug_rule(rule):
from werkzeug.routing import parse_rule
- buf = StringIO.StringIO()
+ buf = six.StringIO()
for conv, arg, var in parse_rule(rule):
if conv:
buf.write('(')
@@ -97,7 +94,7 @@
app = import_object(self.arguments[0])
for method, path, endpoint in get_routes(app):
try:
- blueprint, endpoint_internal = endpoint.split('.')
+ blueprint, _, endpoint_internal = endpoint.rpartition('.')
if self.blueprints and blueprint not in self.blueprints:
continue
if blueprint in self.undoc_blueprints:
@@ -122,9 +119,10 @@
meth_func = getattr(view.view_class, method.lower(), None)
if meth_func and meth_func.__doc__:
docstring = meth_func.__doc__
- if not isinstance(docstring, unicode):
+ if not isinstance(docstring, six.text_type):
analyzer = ModuleAnalyzer.for_module(view.__module__)
docstring = force_decode(docstring, analyzer.encoding)
+
if not docstring and 'include-empty-docstring' not in self.options:
continue
docstring = prepare_docstring(docstring)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/tornado.py new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/tornado.py
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/autohttp/tornado.py 2013-10-19 09:41:10.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/autohttp/tornado.py 2014-07-30 18:33:17.000000000 +0200
@@ -12,10 +12,7 @@
import inspect
import re
-try:
- import cStringIO as StringIO
-except ImportError:
- import StringIO
+import six
from docutils import nodes
from docutils.parsers.rst import directives
@@ -32,7 +29,7 @@
def translate_tornado_rule(app, rule):
- buf = StringIO.StringIO()
+ buf = six.StringIO()
for name, filter, conf in app.router.parse_rule(rule):
if filter:
buf.write('(')
@@ -52,19 +49,17 @@
def get_routes(app):
for spec in app.handlers[0][1]:
handler = spec.handler_class
- methods = inspect.getmembers(handler, predicate=inspect.ismethod)
-
doc_methods = list(handler.SUPPORTED_METHODS)
if 'HEAD' in doc_methods:
doc_methods.remove('HEAD')
if 'OPTIONS' in doc_methods:
doc_methods.remove('OPTIONS')
- for method in methods:
- if method[0].upper() not in doc_methods:
- continue
- path = spec.regex.pattern
- yield method[0], path, handler
+ for method in doc_methods:
+ maybe_method = getattr(handler, method.lower(), None)
+ if (inspect.isfunction(maybe_method) or
+ inspect.ismethod(maybe_method)):
+ yield method.lower(), spec.regex.pattern, handler
def normalize_path(path):
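The tornado.py change fixes Python 3 by iterating `SUPPORTED_METHODS` and probing the handler with `getattr`, instead of `inspect.getmembers(..., predicate=inspect.ismethod)` (on Python 3, unbound methods are plain functions, so the old predicate missed them). A sketch of the new discovery loop with a stand-in handler class, not tornado's real `RequestHandler`:

```python
import inspect

class WidgetHandler:
    # Stand-in mimicking the tornado RequestHandler surface used here.
    SUPPORTED_METHODS = ('GET', 'HEAD', 'POST', 'DELETE', 'PUT', 'OPTIONS')
    def get(self): pass
    def post(self): pass

def documented_methods(handler):
    # Skip HEAD/OPTIONS, then keep only methods the handler implements;
    # isfunction() covers Python 3, ismethod() covers Python 2.
    doc_methods = [m for m in handler.SUPPORTED_METHODS
                   if m not in ('HEAD', 'OPTIONS')]
    for method in doc_methods:
        maybe_method = getattr(handler, method.lower(), None)
        if (inspect.isfunction(maybe_method) or
                inspect.ismethod(maybe_method)):
            yield method.lower()
```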
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/httpdomain.py new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/httpdomain.py
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib/httpdomain.py 2013-10-19 09:36:08.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib/httpdomain.py 2014-07-30 18:33:17.000000000 +0200
@@ -16,65 +16,130 @@
from pygments.lexer import RegexLexer, bygroups
from pygments.lexers import get_lexer_by_name
-from pygments.token import Literal, Text, Operator, Keyword, Name, Number
+from pygments.token import Literal, Text, Operator, Keyword, Name, Number
from pygments.util import ClassNotFound
from sphinx import addnodes
from sphinx.roles import XRefRole
from sphinx.domains import Domain, ObjType, Index
-from sphinx.directives import ObjectDescription
+from sphinx.directives import ObjectDescription, directives
from sphinx.util.nodes import make_refnode
from sphinx.util.docfields import GroupedField, TypedField
class DocRef(object):
- """Represents a link to an RFC which defines an HTTP method."""
+ """Represents a reference to an abstract specification."""
def __init__(self, base_url, anchor, section):
- """Stores the specified attributes which represent a URL which links to
- an RFC which defines an HTTP method.
-
- """
self.base_url = base_url
self.anchor = anchor
self.section = section
def __repr__(self):
- """Returns the URL which this object represents, which points to the
- location of the RFC which defines some HTTP method.
+ """Returns the URL onto related specification section for the related
+ object."""
+ return '{0}#{1}{2}'.format(self.base_url, self.anchor, self.section)
+
+
+class RFC2616Ref(DocRef):
+ """Represents a reference to RFC2616."""
+
+ def __init__(self, section):
+ url = 'http://www.w3.org/Protocols/rfc2616/rfc2616-sec{0:d}.html'
+ url = url.format(int(section))
+ super(RFC2616Ref, self).__init__(url, 'sec', section)
- """
- return '{}#{}{}'.format(self.base_url, self.anchor, self.section)
+class IETFRef(DocRef):
+ """Represents a reference to the specific IETF RFC."""
-#: The URL of the HTTP/1.1 RFC which defines the HTTP methods OPTIONS, GET,
-#: HEAD, POST, PUT, DELETE, TRACE, and CONNECT.
-RFC2616 = 'http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html'
+ def __init__(self, rfc, section):
+ url = 'http://tools.ietf.org/html/rfc{0:d}'.format(rfc)
+ super(IETFRef, self).__init__(url, 'section-', section)
-#: The name to use for section anchors in RFC2616.
-RFC2616ANCHOR = 'sec'
-#: The URL of the RFC which defines the HTTP PATCH method.
-RFC5789 = 'http://tools.ietf.org/html/rfc5789'
+class EventSourceRef(DocRef):
+
+ def __init__(self, section):
+ url = 'http://www.w3.org/TR/eventsource/'
+ super(EventSourceRef, self).__init__(url, section, '')
-#: The name to use for section anchors in RFC5789.
-RFC5789ANCHOR = 'section-'
#: Mapping from lowercase HTTP method name to :class:`DocRef` object which
#: maintains the URL which points to the section of the RFC which defines that
#: HTTP method.
-DOCREFS = {
- 'patch': DocRef(RFC5789, RFC5789ANCHOR, 2),
- 'options': DocRef(RFC2616, RFC2616ANCHOR, 9.2),
- 'get': DocRef(RFC2616, RFC2616ANCHOR, 9.3),
- 'head': DocRef(RFC2616, RFC2616ANCHOR, 9.4),
- 'post': DocRef(RFC2616, RFC2616ANCHOR, 9.5),
- 'put': DocRef(RFC2616, RFC2616ANCHOR, 9.6),
- 'delete': DocRef(RFC2616, RFC2616ANCHOR, 9.7),
- 'trace': DocRef(RFC2616, RFC2616ANCHOR, 9.8),
- 'connect': DocRef(RFC2616, RFC2616ANCHOR, 9.9)
+METHOD_REFS = {
+ 'patch': IETFRef(5789, 2),
+ 'options': RFC2616Ref(9.2),
+ 'get': RFC2616Ref(9.3),
+ 'head': RFC2616Ref(9.4),
+ 'post': RFC2616Ref(9.5),
+ 'put': RFC2616Ref(9.6),
+ 'delete': RFC2616Ref(9.7),
+ 'trace': RFC2616Ref(9.8),
+ 'connect': RFC2616Ref(9.9),
+ 'copy': IETFRef(2518, 8.8),
+ 'any': ''
+}
+
+
+#: Mapping from HTTP header name to :class:`DocRef` object which
+#: maintains the URL which points to the related section of the RFC.
+HEADER_REFS = {
+ 'Accept': RFC2616Ref(14.1),
+ 'Accept-Charset': RFC2616Ref(14.2),
+ 'Accept-Encoding': RFC2616Ref(14.3),
+ 'Accept-Language': RFC2616Ref(14.4),
+ 'Accept-Ranges': RFC2616Ref(14.5),
+ 'Age': RFC2616Ref(14.6),
+ 'Allow': RFC2616Ref(14.7),
+ 'Authorization': RFC2616Ref(14.8),
+ 'Cache-Control': RFC2616Ref(14.9),
+ 'Connection': RFC2616Ref(14.10),
+ 'Content-Encoding': RFC2616Ref(14.11),
+ 'Content-Language': RFC2616Ref(14.12),
+ 'Content-Length': RFC2616Ref(14.13),
+ 'Content-Location': RFC2616Ref(14.14),
+ 'Content-MD5': RFC2616Ref(14.15),
+ 'Content-Range': RFC2616Ref(14.16),
+ 'Content-Type': RFC2616Ref(14.17),
+ 'Cookie': IETFRef(2109, '4.3.4'),
+ 'Date': RFC2616Ref(14.18),
+ 'Destination': IETFRef(2518, 9.3),
+ 'ETag': RFC2616Ref(14.19),
+ 'Expect': RFC2616Ref(14.20),
+ 'Expires': RFC2616Ref(14.21),
+ 'From': RFC2616Ref(14.22),
+ 'Host': RFC2616Ref(14.23),
+ 'If-Match': RFC2616Ref(14.24),
+ 'If-Modified-Since': RFC2616Ref(14.25),
+ 'If-None-Match': RFC2616Ref(14.26),
+ 'If-Range': RFC2616Ref(14.27),
+ 'If-Unmodified-Since': RFC2616Ref(14.28),
+ 'Last-Event-ID': EventSourceRef('last-event-id'),
+ 'Last-Modified': RFC2616Ref(14.29),
+ 'Location': RFC2616Ref(14.30),
+ 'Max-Forwards': RFC2616Ref(14.31),
+ 'Pragma': RFC2616Ref(14.32),
+ 'Proxy-Authenticate': RFC2616Ref(14.33),
+ 'Proxy-Authorization': RFC2616Ref(14.34),
+ 'Range': RFC2616Ref(14.35),
+ 'Referer': RFC2616Ref(14.36),
+ 'Retry-After': RFC2616Ref(14.37),
+ 'Server': RFC2616Ref(14.38),
+ 'Set-Cookie': IETFRef(2109, '4.2.2'),
+ 'TE': RFC2616Ref(14.39),
+ 'Trailer': RFC2616Ref(14.40),
+ 'Transfer-Encoding': RFC2616Ref(14.41),
+ 'Upgrade': RFC2616Ref(14.42),
+ 'User-Agent': RFC2616Ref(14.43),
+ 'Vary': RFC2616Ref(14.44),
+ 'Via': RFC2616Ref(14.45),
+ 'Warning': RFC2616Ref(14.46),
+ 'WWW-Authenticate': RFC2616Ref(14.47)
}
+
HTTP_STATUS_CODES = {
100: 'Continue',
101: 'Switching Protocols',
@@ -133,8 +198,19 @@
re.VERBOSE)
+def sort_by_method(entries):
+ def cmp(item):
+ order = ['HEAD', 'GET', 'POST', 'PUT', 'DELETE', 'PATCH',
+ 'OPTIONS', 'TRACE', 'CONNECT', 'COPY', 'ANY']
+ method = item[0].split(' ', 1)[0]
+ if method in order:
+ return order.index(method)
+ return 100
+ return sorted(entries, key=cmp)
+
+
def http_resource_anchor(method, path):
- path = re.sub(r'[<>:/]', '-', path)
+ path = re.sub(r'[{}]', '', re.sub(r'[<>:/]', '-', path))
return method.lower() + '-' + path
@@ -144,38 +220,60 @@
TypedField('parameter', label='Parameters',
names=('param', 'parameter', 'arg', 'argument'),
typerolename='obj', typenames=('paramtype', 'type')),
- TypedField('jsonparameter', label='Json Parameters',
+ TypedField('jsonparameter', label='JSON Parameters',
names=('jsonparameter', 'jsonparam', 'json'),
typerolename='obj', typenames=('jsonparamtype',
'jsontype')),
+ TypedField('requestjsonobject', label='Request JSON Object',
+ names=('reqjsonobj', 'reqjson', '<jsonobj', '<json'),
+ typerolename='obj', typenames=('reqjsonobj', '<jsonobj')),
+ TypedField('requestjsonarray', label='Request JSON Array of Objects',
+ names=('reqjsonarr', '<jsonarr'),
+ typerolename='obj',
+ typenames=('reqjsonarrtype', '<jsonarrtype')),
+ TypedField('responsejsonobject', label='Response JSON Object',
+ names=('resjsonobj', 'resjson', '>jsonobj', '>json'),
+ typerolename='obj', typenames=('resjsonobj', '>jsonobj')),
+ TypedField('responsejsonarray', label='Response JSON Array of Objects',
+ names=('resjsonarr', '>jsonarr'),
+ typerolename='obj',
+ typenames=('resjsonarrtype', '>jsonarrtype')),
TypedField('queryparameter', label='Query Parameters',
- names=('queryparameter', 'queryparam', 'qparam', 'query'),
- typerolename='obj', typenames=('queryparamtype', 'querytype', 'qtype')),
+ names=('queryparameter', 'queryparam', 'qparam', 'query'),
+ typerolename='obj',
+ typenames=('queryparamtype', 'querytype', 'qtype')),
GroupedField('formparameter', label='Form Parameters',
names=('formparameter', 'formparam', 'fparam', 'form')),
GroupedField('requestheader', label='Request Headers',
rolename='mailheader',
- names=('reqheader', 'requestheader')),
+ names=('<header', 'reqheader', 'requestheader')),
GroupedField('responseheader', label='Response Headers',
rolename='mailheader',
- names=('resheader', 'responseheader')),
+ names=('>header', 'resheader', 'responseheader')),
GroupedField('statuscode', label='Status Codes',
rolename='statuscode',
names=('statuscode', 'status', 'code'))
]
+ option_spec = {
+ 'deprecated': directives.flag,
+ 'noindex': directives.flag,
+ 'synopsis': lambda x: x,
+ }
+
method = NotImplemented
def handle_signature(self, sig, signode):
method = self.method.upper() + ' '
signode += addnodes.desc_name(method, method)
offset = 0
+ path = None
for match in http_sig_param_re.finditer(sig):
path = sig[offset:match.start()]
signode += addnodes.desc_name(path, path)
params = addnodes.desc_parameterlist()
typ = match.group('type')
if typ:
- typ = typ + ': '
+ typ += ': '
params += addnodes.desc_annotation(typ, typ)
name = match.group('name')
params += addnodes.desc_parameter(name, name)
@@ -184,6 +282,7 @@
if offset < len(sig):
path = sig[offset:len(sig)]
signode += addnodes.desc_name(path, path)
+ assert path is not None, 'no matches for sig: %s' % sig
fullname = self.method.upper() + ' ' + path
signode['method'] = self.method
signode['path'] = sig
@@ -195,7 +294,11 @@
def add_target_and_index(self, name_cls, sig, signode):
signode['ids'].append(http_resource_anchor(*name_cls[1:]))
- self.env.domaindata['http'][self.method][sig] = (self.env.docname, '')
+ if 'noindex' not in self.options:
+ self.env.domaindata['http'][self.method][sig] = (
+ self.env.docname,
+ self.options.get('synopsis', ''),
+ 'deprecated' in self.options)
def get_index_text(self, modname, name):
return ''
@@ -241,8 +344,27 @@
method = 'trace'
+class HTTPConnect(HTTPResource):
+
+ method = 'connect'
+
+
+class HTTPCopy(HTTPResource):
+
+ method = 'copy'
+
+
+class HTTPAny(HTTPResource):
+
+ method = 'any'
+
+
def http_statuscode_role(name, rawtext, text, lineno, inliner,
- options={}, content=[]):
+ options=None, content=None):
+ if options is None:
+ options = {}
+ if content is None:
+ content = []
if text.isdigit():
code = int(text)
try:
@@ -268,11 +390,10 @@
nodes.reference(rawtext)
if code == 226:
url = 'http://www.ietf.org/rfc/rfc3229.txt'
- if code == 418:
+ elif code == 418:
url = 'http://www.ietf.org/rfc/rfc2324.txt'
- if code == 449:
- url = 'http://msdn.microsoft.com/en-us/library' \
- '/dd891478(v=prot.10).aspx'
+ elif code == 449:
+ url = 'http://msdn.microsoft.com/en-us/library/dd891478(v=prot.10).aspx'
elif code in HTTP_STATUS_CODES:
url = 'http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html' \
'#sec10.' + ('%d.%d' % (code // 100, 1 + code % 100))
@@ -285,18 +406,45 @@
def http_method_role(name, rawtext, text, lineno, inliner,
- options={}, content=[]):
+ options=None, content=None):
+ if options is None:
+ options = {}
+ if content is None:
+ content = []
method = str(text).lower()
- if method not in DOCREFS:
+ if method not in METHOD_REFS:
msg = inliner.reporter.error('%s is not valid HTTP method' % method,
lineno=lineno)
prb = inliner.problematic(rawtext, rawtext, msg)
return [prb], [msg]
- url = str(DOCREFS[method])
+ url = str(METHOD_REFS[method])
+ if not url:
+ return [nodes.emphasis(text, text)], []
node = nodes.reference(rawtext, method.upper(), refuri=url, **options)
return [node], []
+def http_header_role(name, rawtext, text, lineno, inliner,
+ options=None, content=None):
+ if options is None:
+ options = {}
+ if content is None:
+ content = []
+ header = str(text)
+ if header not in HEADER_REFS:
+ header = header.title()
+ if header not in HEADER_REFS:
+ if header.startswith('X-'): # skip custom headers
+ return [nodes.strong(header, header)], []
+ msg = inliner.reporter.error('%s is not a known HTTP header' % header,
+ lineno=lineno)
+ prb = inliner.problematic(rawtext, rawtext, msg)
+ return [prb], [msg]
+ url = str(HEADER_REFS[header])
+ node = nodes.reference(rawtext, header, refuri=url, **options)
+ return [node], []
+
+
class HTTPXRefRole(XRefRole):
def __init__(self, method, **kwargs):
@@ -320,9 +468,16 @@
def __init__(self, *args, **kwargs):
super(HTTPIndex, self).__init__(*args, **kwargs)
- self.ignore = [[l for l in x.split('/') if l]
+ self.ignore = [
+ [l for l in x.split('/') if l]
for x in self.domain.env.config['http_index_ignore_prefixes']]
- self.ignore.sort(key=lambda x: -len(x))
+ self.ignore.sort(reverse=True)
+
+ # During HTML generation these values are picked from the class,
+ # not from the instance, so we patch them onto the class here
+ cls = self.__class__
+ cls.shortname = self.domain.env.config['http_index_shortname']
+ cls.localname = self.domain.env.config['http_index_localname']
def grouping_prefix(self, path):
letters = [x for x in path.split('/') if x]
@@ -334,17 +489,21 @@
def generate(self, docnames=None):
content = {}
items = ((method, path, info)
- for method, routes in self.domain.routes.items()
- for path, info in routes.items())
+ for method, routes in self.domain.routes.items()
+ for path, info in routes.items())
items = sorted(items, key=lambda item: item[1])
for method, path, info in items:
entries = content.setdefault(self.grouping_prefix(path), [])
entries.append([
method.upper() + ' ' + path, 0, info[0],
- http_resource_anchor(method, path), '', '', info[1]
+ http_resource_anchor(method, path),
+ '', 'Deprecated' if info[2] else '', info[1]
])
- content = sorted(content.items(), key=lambda k: k[0])
- return (content, True)
+ items = sorted(
+ (path, sort_by_method(entries))
+ for path, entries in content.items()
+ )
+ return (items, True)
class HTTPDomain(Domain):
@@ -361,7 +520,10 @@
'put': ObjType('put', 'put', 'obj'),
'patch': ObjType('patch', 'patch', 'obj'),
'delete': ObjType('delete', 'delete', 'obj'),
- 'trace': ObjType('trace', 'trace', 'obj')
+ 'trace': ObjType('trace', 'trace', 'obj'),
+ 'connect': ObjType('connect', 'connect', 'obj'),
+ 'copy': ObjType('copy', 'copy', 'obj'),
+ 'any': ObjType('any', 'any', 'obj')
}
directives = {
@@ -372,7 +534,10 @@
'put': HTTPPut,
'patch': HTTPPatch,
'delete': HTTPDelete,
- 'trace': HTTPTrace
+ 'trace': HTTPTrace,
+ 'connect': HTTPConnect,
+ 'copy': HTTPCopy,
+ 'any': HTTPAny
}
roles = {
@@ -384,19 +549,26 @@
'patch': HTTPXRefRole('patch'),
'delete': HTTPXRefRole('delete'),
'trace': HTTPXRefRole('trace'),
+ 'connect': HTTPXRefRole('connect'),
+ 'copy': HTTPXRefRole('copy'),
+ 'any': HTTPXRefRole('any'),
'statuscode': http_statuscode_role,
- 'method': http_method_role
+ 'method': http_method_role,
+ 'header': http_header_role
}
initial_data = {
- 'options': {}, # path: (docname, synopsis)
+ 'options': {}, # path: (docname, synopsis)
'head': {},
'post': {},
'get': {},
'put': {},
'patch': {},
'delete': {},
- 'trace': {}
+ 'trace': {},
+ 'connect': {},
+ 'copy': {},
+ 'any': {}
}
indices = [HTTPIndex]
@@ -416,7 +588,13 @@
try:
info = self.data[str(typ)][target]
except KeyError:
- return
+ text = contnode.rawsource
+ if typ == 'statuscode':
+ return http_statuscode_role(None, text, text, None, None)[0][0]
+ elif typ == 'mailheader':
+ return http_header_role(None, text, text, None, None)[0][0]
+ else:
+ return nodes.emphasis(text, text)
else:
anchor = http_resource_anchor(typ, target)
title = typ.upper() + ' ' + target
@@ -474,7 +652,7 @@
tokens = {
'root': [
- (r'(GET|POST|PUT|PATCH|DELETE|HEAD|OPTIONS|TRACE)( +)([^ ]+)( +)'
+ (r'(GET|POST|PUT|PATCH|DELETE|HEAD|OPTIONS|TRACE|COPY)( +)([^ ]+)( +)'
r'(HTTPS?)(/)(1\.[01])(\r?\n|$)',
bygroups(Name.Function, Text, Name.Namespace, Text,
Keyword.Reserved, Operator, Number, Text),
@@ -502,4 +680,5 @@
except ClassNotFound:
app.add_lexer('http', HTTPLexer())
app.add_config_value('http_index_ignore_prefixes', [], None)
-
+ app.add_config_value('http_index_shortname', 'routing table', True)
+ app.add_config_value('http_index_localname', 'HTTP Routing Table', True)
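One of the smaller additions in this diff, `sort_by_method`, gives the HTTP index its stable ordering. Reproduced here outside the diff context (key function renamed for clarity, behavior as in the hunk): entries are ranked by a fixed method order, and unknown methods sort last.

```python
def sort_by_method(entries):
    # Each entry's first element starts with "METHOD path"; rank by a
    # fixed method order, pushing unrecognized methods to the end.
    def order_key(item):
        order = ['HEAD', 'GET', 'POST', 'PUT', 'DELETE', 'PATCH',
                 'OPTIONS', 'TRACE', 'CONNECT', 'COPY', 'ANY']
        method = item[0].split(' ', 1)[0]
        if method in order:
            return order.index(method)
        return 100
    return sorted(entries, key=order_key)
```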
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib_httpdomain.egg-info/PKG-INFO new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib_httpdomain.egg-info/PKG-INFO
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib_httpdomain.egg-info/PKG-INFO 2013-10-19 09:45:53.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib_httpdomain.egg-info/PKG-INFO 2014-07-30 18:41:22.000000000 +0200
@@ -1,6 +1,6 @@
-Metadata-Version: 1.0
+Metadata-Version: 1.1
Name: sphinxcontrib-httpdomain
-Version: 1.2.0
+Version: 1.3.0
Summary: Sphinx domain for HTTP APIs
Home-page: http://bitbucket.org/birkenfeld/sphinx-contrib
Author: Hong Minhee
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib_httpdomain.egg-info/requires.txt new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib_httpdomain.egg-info/requires.txt
--- old/sphinxcontrib-httpdomain-1.2.0/sphinxcontrib_httpdomain.egg-info/requires.txt 2013-10-19 09:45:53.000000000 +0200
+++ new/sphinxcontrib-httpdomain-1.3.0/sphinxcontrib_httpdomain.egg-info/requires.txt 2014-07-30 18:41:22.000000000 +0200
@@ -1 +1,2 @@
-Sphinx>=1.0
\ No newline at end of file
+Sphinx >= 1.0
+six
\ No newline at end of file
--
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]