2006/6/18, Manuel Mall <[EMAIL PROTECTED]>:
It has been reported on fop-user that the current keep implementation is
probably too strict; that is, instead of weakening a keep condition when
no solution can be found, FOP trunk currently gives up with an error
message.

I had a quick fiddle in one area and changed the Knuth penalty generated
for a keep...="always" from INFINITE to INFINITE-1. In my few test
cases that seems to have resolved the issue.

In fact you have turned the corresponding element into a legal
breakpoint, which it was not when the penalty was infinite: in that case
the element was not even considered by the algorithm as a possible
breakpoint.
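
For clarity, here is a minimal sketch of that distinction, assuming
FOP's convention that INFINITE is 1000 (KnuthElement.INFINITE); the
class and helper method are hypothetical, for illustration only:

    class BreakpointCheck {
        static final int INFINITE = 1000; // FOP's "infinite" penalty value

        // An element with penalty == INFINITE is not a legal breakpoint:
        // the algorithm skips it before any demerit is computed.
        // INFINITE - 1 makes it legal, though strongly discouraged.
        static boolean isLegalBreakpoint(int penalty) {
            return penalty < INFINITE;
        }
    }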


However, I am interested in comments from those more familiar with the
mathematical background behind the Knuth algorithm on whether such a
solution is appropriate, or whether there could be unintended side
effects, e.g. this INFINITE-1 break being chosen even when there are
other allowed breaks which should be preferred according to the FO input
but which have higher penalties when run through the Knuth algorithm.

The new legal breakpoints turn out to be feasible breakpoints, but with
very high demerits. In fact there are two different infinities here: the
value of infinite penalties (1000 in FOP, IIC), and the values of
demerits. In FOP the set of best active nodes is initialized to an empty
set, and the corresponding demerits are initialized to
Double.POSITIVE_INFINITY (see layoutmgr.BreakingAlgorithm.BestRecords).
As soon as a feasible breakpoint is found it will be recorded, since its
demerits, however high, will be lower than POSITIVE_INFINITY. You now
end up with a non-empty set of "very bad" breakpoints.
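
As a rough illustration of that bookkeeping (the names below are
simplified stand-ins, not FOP's exact API), the recording of best
breakpoints behaves more or less like this:

    import java.util.Arrays;

    class BestRecordsSketch {
        // One slot per fitness class, as in Knuth's algorithm.
        private final double[] bestDemerits = new double[4];

        BestRecordsSketch() {
            // Empty set: all demerits start at positive infinity.
            Arrays.fill(bestDemerits, Double.POSITIVE_INFINITY);
        }

        void addRecord(int fitnessClass, double demerits) {
            // Any finite demerits, however large, are smaller than
            // POSITIVE_INFINITY, so the first feasible break found in a
            // fitness class is always recorded.
            if (demerits < bestDemerits[fitnessClass]) {
                bestDemerits[fitnessClass] = demerits;
            }
        }
    }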

This should work, as the formula used to compute demerits will favour a
very loose page over a page which ends at a very high penalty item.
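
To make the comparison concrete, here is a much simplified sketch of
Knuth-style demerits (FOP's actual formula also adds fitness-class
demerits; the line penalty constant is illustrative):

    class DemeritsSketch {
        static final int INFINITE = 1000;

        static double demerits(double badness, int penalty) {
            double linePenalty = 1.0; // illustrative constant
            double base = Math.pow(linePenalty + Math.abs(badness), 2);
            if (penalty >= INFINITE) {
                return Double.POSITIVE_INFINITY; // not a legal break
            } else if (penalty > 0) {
                return base + (double) penalty * penalty; // discouraged
            } else if (penalty > -INFINITE) {
                return base - (double) penalty * penalty; // encouraged
            }
            return base; // forced break: penalty term ignored
        }
    }

Under this sketch a loose page with adjustment ratio 2 (badness
100 * 2^3 = 800) costs (1 + 800)^2 = 641,601 demerits, while merely
ending at an INFINITE-1 penalty adds 999^2 = 998,001, so the loose page
is preferred.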
However...


Or should we use a more refined approach where we initially generate an
INFINITE penalty, but if the page breaking cannot find a solution we
reduce the penalty on some/all of those elements given an INFINITE
penalty because of keeps and run the page breaker again?

... I think this is a better idea. I'm not completely sure that the
first approach really has no side effects, and in particular whether it
makes it possible to match the spec's requirements (last paragraph of
§ 4.8). Restarting the algorithm with relaxed constraints is more in the
spirit of the Knuth algorithm.
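
A rough sketch of that restart strategy (all types and methods below are
hypothetical stand-ins, not FOP's API):

    import java.util.List;

    class RelaxingBreaker {
        static final int INFINITE = 1000;

        interface Penalty {
            boolean fromKeep(); // was this penalty generated by a keep?
            int value();
            void setValue(int v);
        }

        interface Breaker {
            // Returns true on success, false if no feasible set of
            // breakpoints exists.
            boolean findBreakingPoints(List<Penalty> penalties);
        }

        boolean breakWithRelaxation(Breaker breaker,
                                    List<Penalty> penalties) {
            if (breaker.findBreakingPoints(penalties)) {
                return true; // strict keeps could be honoured
            }
            // First pass failed: weaken keep-driven infinite penalties
            // so the affected elements become legal (though heavily
            // penalised) breakpoints, then run the page breaker again.
            for (Penalty p : penalties) {
                if (p.fromKeep() && p.value() == INFINITE) {
                    p.setValue(INFINITE - 1);
                }
            }
            return breaker.findBreakingPoints(penalties);
        }
    }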


Vincent
