[
https://issues.apache.org/jira/browse/COUCHDB-604?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12793189#action_12793189
]
Joscha Feth commented on COUCHDB-604:
-------------------------------------
Regarding
"If there exists a SAX-style json parser, then that might be another approach,
but I'm not aware of one."
- just because such a parser does not exist "in the wild" does not necessarily
mean there isn't one. We have a JSON Push Parser which is comparable to a
SAX-style parser: it does not take a reader but uses a writer, and once data is
written to that writer, the parser does its work (and emits the events named
above).
This Push Parser now has an extension to handle continuous writes of distinct
JSON objects ({}{}{}...) as the _changes feed delivers them in continuous mode,
but things may look different with another parser out there that expects the
feed to return valid JSON data. The fact that the stream never ends does not
mean that every parser can work with the data.
Start Document
Starting seq
Starting id
Starting changes
Starting rev
Closing rev
...
... <-- lots of changes
...
(a long time later)
End Document <-- timeout of _changes feed happens here
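To make the push-parser idea concrete, here is a minimal sketch of the same
technique in Python (not our actual parser - the names write() and
handle_change() are just placeholders): data gets written into a buffer, and
every complete top-level object found so far is emitted as an event, without
the stream ever having to end.

import json

decoder = json.JSONDecoder()
buffer = ""

def handle_change(change):
    # Placeholder for whatever the application does with one change row.
    print(change.get("seq"), change.get("id"))

def write(chunk):
    # Push-style entry point: the caller writes raw text, and the parser emits
    # every distinct top-level object ({}{}{}...) as soon as it is complete.
    global buffer
    buffer += chunk
    while True:
        buffer = buffer.lstrip()   # skip the newlines between objects
        if not buffer:
            break
        try:
            obj, end = decoder.raw_decode(buffer)
        except json.JSONDecodeError:
            break                  # object not complete yet, wait for more data
        buffer = buffer[end:]
        handle_change(obj)

The point being that there is no End Document to wait for - each object is
emitted the moment its closing brace has been written.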
By the way, XMPP works the same way - it is basically an infinitely long stream
of XML elements - but it still starts with a root node and ends with a closing
one in order to be valid XML.
You might call it a feature request, but I think the output should either be
valid JSON or be documented as not being JSON. If you tell me the output is not
meant to be valid JSON, okay, but I couldn't tell that from the docs, as all
other _changes interfaces return valid JSON.
> _changes feed with ?feed=continuous does not return valid JSON
> --------------------------------------------------------------
>
> Key: COUCHDB-604
> URL: https://issues.apache.org/jira/browse/COUCHDB-604
> Project: CouchDB
> Issue Type: Bug
> Components: HTTP Interface
> Affects Versions: 0.10
> Reporter: Joscha Feth
>
> When using the _changes interface via ?feed=continuous, the response is a
> stream of JSON documents rather than a single valid JSON document:
> {"seq":38,"id":"f473fe61a8a53778d91c38b23ed6e20f","changes":[{"rev":"9-d3e71c7f5f991b26fe014d884a27087f"}]}
> {"seq":68,"id":"2a574814d61d9ec8a0ebbf43fa03d75b","changes":[{"rev":"6-67179f215e42d63092dc6b2199a3bf51"}],"deleted":true}
> {"seq":70,"id":"75dbdacca8e475f5909e3cc298905ef8","changes":[{"rev":"1-0dee261a2bd4c7fb7f2abd811974d3f8"}]}
> {"seq":71,"id":"09fb03236f80ea0680a3909c2d788e43","changes":[{"rev":"1-a9646389608c13a5c26f4c14c6863753"}]}
> To be valid, there needs to be a root element (and then an array with commas),
> as in the non-continuous feed:
> {"results":[
> {"seq":38,"id":"f473fe61a8a53778d91c38b23ed6e20f","changes":[{"rev":"9-d3e71c7f5f991b26fe014d884a27087f"}]},
> {"seq":68,"id":"2a574814d61d9ec8a0ebbf43fa03d75b","changes":[{"rev":"6-67179f215e42d63092dc6b2199a3bf51"}],"deleted":true},
> {"seq":70,"id":"75dbdacca8e475f5909e3cc298905ef8","changes":[{"rev":"1-0dee261a2bd4c7fb7f2abd811974d3f8"}]},
> {"seq":71,"id":"09fb03236f80ea0680a3909c2d788e43","changes":[{"rev":"1-a9646389608c13a5c26f4c14c6863753"}]},
> In short, this means that if someone does not parse the change events in an
> object-at-a-time manner (e.g. waiting for a line ending and then parsing that
> line), but instead uses a SAX-like parser (emitting events for each new
> object, etc.) and expects the response to be JSON (which it is not, because it
> is not {x:[{},{},{}]} but {}{}{}, which is not valid), an error is thrown.
> I can see that people doing this line by line might be okay with the above
> approach, but the response is not valid JSON, and it would be nice if there
> were a flag to make the response valid JSON.
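> To sketch that line-by-line handling (Python here, with a made-up local URL
> and database name, not an official client), each line of the response is
> parsed on its own - which only works because every change happens to arrive
> on a line of its own:
>
> import json
> import urllib.request
>
> # Hypothetical URL: 5984 is the default CouchDB port, "mydb" is a placeholder.
> url = "http://127.0.0.1:5984/mydb/_changes?feed=continuous"
> with urllib.request.urlopen(url) as resp:
>     for raw in resp:                  # the body is read one line at a time
>         line = raw.strip()
>         if not line:
>             continue                  # skip empty keep-alive lines
>         change = json.loads(line)     # each line is a complete JSON object
>         print(change.get("seq"), change.get("id"))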
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.