Parsing of WikiDocument cuts too long lines/paragraphs
------------------------------------------------------
Key: JSPWIKI-527
URL: https://issues.apache.org/jira/browse/JSPWIKI-527
Project: JSPWiki
Issue Type: Bug
Components: Core & storage
Affects Versions: 2.8.1
Environment: Win XP & Linux
Reporter: Jochen Reutelshoefer
After the filters have run, the WikiDocument is re-parsed if a filter changed
the content. If that content contains a paragraph longer than roughly 10,000
characters without a line break, the paragraph gets cut.
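For illustration, a minimal sketch of input that should trigger the reported cut, assuming the 2.8.x rendering entry point WikiEngine.textToHTML(WikiContext, String) and the WikiContext/WikiPage constructors as I recall them; the class, page name and the exact length used here are made up:

    import com.ecyrd.jspwiki.WikiContext;
    import com.ecyrd.jspwiki.WikiEngine;
    import com.ecyrd.jspwiki.WikiPage;

    public class LongLineRepro
    {
        /** Builds a single paragraph of the requested length with no line breaks. */
        static String longSingleLine( int length )
        {
            StringBuilder sb = new StringBuilder( length );
            while( sb.length() < length )
            {
                sb.append( "lorem ipsum " );
            }
            return sb.substring( 0, length );
        }

        /** engine must be a fully configured WikiEngine, e.g. from a test harness. */
        static void render( WikiEngine engine )
        {
            String markup = longSingleLine( 15000 );  // well above the ~10,000 character limit
            WikiContext ctx = new WikiContext( engine, new WikiPage( engine, "ReproPage" ) );
            String html = engine.textToHTML( ctx, markup );

            // Expected: the whole paragraph is rendered.
            // Observed (per this report): the output is cut at roughly the same length every time.
            System.out.println( "input=" + markup.length() + " output=" + html.length() );
        }
    }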
I have a PageFilter that produces fairly long HTML output. If I don't insert
any line breaks, the output is always trimmed to the same length, which
destroys the HTML structure of the page.
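A stripped-down sketch of the kind of filter involved, assuming the 2.8.x filter API (com.ecyrd.jspwiki.filters.BasicPageFilter, preTranslate returning the modified markup); the class name and the generated content are hypothetical:

    import com.ecyrd.jspwiki.WikiContext;
    import com.ecyrd.jspwiki.filters.BasicPageFilter;
    import com.ecyrd.jspwiki.filters.FilterException;

    /**
     * Appends a long, unbroken block of generated HTML to the page content.
     * Because the returned text differs from the input, JSPWiki re-parses
     * the document after the filter chain has run.
     */
    public class LongOutputFilter extends BasicPageFilter
    {
        public String preTranslate( WikiContext context, String content )
            throws FilterException
        {
            StringBuilder html = new StringBuilder( content );
            html.append( "\n\n<div class=\"generated\">" );
            for( int i = 0; i < 2000; i++ )
            {
                html.append( "<span>item " ).append( i ).append( "</span>" );  // no line breaks
            }
            html.append( "</div>" );

            // As reported: inserting line breaks into the generated output avoids
            // the cut; leaving it as one long run gets it trimmed on re-parse.
            return html.toString();
        }
    }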
This might have something to do with the algorithm that parses the tree of
Content objects and/or the data structures it uses.