docbook builds intermittently fail with a duplicate ID error, detected during 
FOP (PDF build) processing
-------------------------------------------------------------------------------------------------------

                 Key: UIMA-2288
                 URL: https://issues.apache.org/jira/browse/UIMA-2288
             Project: UIMA
          Issue Type: Improvement
          Components: Build, Packaging and Test
    Affects Versions: build-resources-2
         Environment: Windows, multi-core machine
            Reporter: Marshall Schor
            Assignee: Marshall Schor
            Priority: Minor
             Fix For: build-resources-3


The docbkx process runs an xslt transformation over the source to produce a .fo 
input file, which FOP then uses to produce the PDF.  From time to time, this 
transformation generates xml <fo:block id="dxxxx"> elements with identical 
"id"s; FOP checks for this and stops with an error when it finds a duplicate, 
since duplicate ids are illegal in xml.
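
For illustration, the failure in the generated .fo file looks roughly like the 
following (the specific id value and image path below are hypothetical; the 
"dxxxx" ids are the ones the stylesheets auto-generate):

```xml
<fo:block id="d0e4321">
  <fo:external-graphic src="url(images/someFigure.png)"/>
</fo:block>
<!-- ... elsewhere in the same .fo file ... -->
<fo:block id="d0e4321">  <!-- same id again: FOP aborts on this duplicate -->
  <fo:external-graphic src="url(images/otherFigure.png)"/>
</fo:block>
```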

These ids appear not to be used.  I looked through one file (the UIMA tools 
guide) and found no references to them.

The problem is intermittent and may be related to multi-core machines.  On my 
latest laptop (which has 4 real cores), I often get this error.  Rerunning a 
few times usually gets around it.  I can increase the likelihood that rerunning 
works by kicking off some other CPU-intensive work on my laptop while the xslt 
transformation is running.

These generated <fo:block id="dxxx"> elements with duplicate ids always appear 
around the embedding of external graphics.  

Fix this by finding the template in the xslt transform stylesheets that 
generates these elements, removing the id=... part, and adding the modified 
template as a customization override in our docbook process.
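
A minimal sketch of what such a customization layer might look like.  The 
import href and the match pattern (mediaobject) are assumptions here: the real 
override must start from a copy of the actual template in the stock docbook-xsl 
FO stylesheets that wraps external graphics, with only the id attribute removed.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Customization layer over the stock docbook FO stylesheets. -->
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:fo="http://www.w3.org/1999/XSL/Format"
                version="1.0">

  <!-- Hypothetical import path; point this at the stylesheet the build
       actually uses (e.g. the docbkx-resolved docbook-xsl fo/docbook.xsl). -->
  <xsl:import href="urn:docbkx:stylesheet"/>

  <!-- Override sketch: emit the wrapping fo:block around the graphic
       WITHOUT the generated id attribute, leaving the rest of the stock
       template's behavior to apply-templates. -->
  <xsl:template match="mediaobject">
    <fo:block>
      <xsl:apply-templates/>
    </fo:block>
  </xsl:template>

</xsl:stylesheet>
```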

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira