Hello everybody, I presented "Recent Development Directions for makeinfo" at the GNU Hackers Meeting in 2011. Overall, the quality of the talks and of the people there was amazing.
http://www.gnu.org/ghm/2011/paris/

The summary of my talk is: Texinfo is the official documentation format for the GNU project. However, the last release dates back to 2008. This long period is due to many changes happening behind the scenes: first the merging of texi2html into Texinfo, to replace makeinfo, and then the replacement with a Parser that transforms the Texinfo document into a tree further processed by Converters. After detailing this recent (and not so recent) history, a brief presentation of the Parser will show what's next for makeinfo.

The video is here: http://audio-video.gnu.org/video/ghm2011/Patrice_Dumas-Makeinfo.ogv
The slides are here: http://www.gnu.org/ghm/2011/paris/slides/patrice-dumas-texinfo.pdf

Unfortunately the questions from the participants cannot really be heard. I found the reactions to be quite positive.

Ralf had already voiced his concern regarding m4. The current idea is not to do anything specific for m4, but to provide support for synchronization lines that could have been put in the Texinfo code by m4 (or another preprocessor), like:

#line 34 file.texi

Brian Gough mentioned the existing XML tree processing modules and proposed reusing some of them. I investigated that a bit, but didn't really find what I was looking for, one reason being that the tree is currently a 'double' tree, in which elements have both args and contents (like a @quotation, with an argument on the @quotation line and contents between the beginning and the @end quotation); a rough sketch of such a node is given below.

Jim Blandy proposed JSON as an intermediate format. I think it is a good idea, although not a priority.

There was a very interesting intervention by Andy Wingo, who said that there is a Guile parser/renderer for Texinfo, used for example for in-source rendering of Texinfo fragments. He said that the intermediate Lisp-like format used by the interpreter is very similar to the one I presented during the talk. Based on this format, it is even possible to do some direct rendering of the Texinfo, using Guile and Cairo for example. They were also very interested in having the Parser output a Texinfo document in their tree representation, so that they can render it themselves, since parsing Texinfo is not easy. I think this is a very good idea and I would like to get back in touch with Andy and implement that output as soon as possible. It should not delay the release, though; in my opinion it is something to do right after. Andy also said that the XML currently produced was not right, which I agree with, although this issue should hopefully be fixed now that the XML produced is more of a mapping of the original Texinfo.

On that subject, I had some interesting discussions with Bruno Haible (not during the talk; we had begun this discussion previously at a mini GHM in Bordeaux). To summarize, he would be interested in being able to use the Texinfo XML to extract units of text: paragraphs, titles, and the like. These units of text could then be translated, then reassembled into a Texinfo XML document, and finally the Texinfo XML could be mapped back to Texinfo code. The Texinfo XML produced by the Parser is now (in my opinion) ready for that task, since it is generated in such a way that it can be mapped back to Texinfo code without loss of (relevant) information, and units of text appear within a rather limited number of XML elements (mostly paragraphs, preformatted blocks and inline command arguments); a small extraction sketch follows below.
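To illustrate the 'double' tree mentioned above, and incidentally what a JSON dump along the lines of Jim Blandy's suggestion could look like, here is a rough sketch in Perl of a node for a @quotation that carries both an argument and contents. The field names (cmdname, args, contents, text) are purely illustrative, not necessarily the ones actually used by the Parser.

  #!/usr/bin/perl
  # Sketch of a 'double' tree node: the element has both 'args'
  # (the text on the @quotation line) and 'contents' (what appears
  # up to the matching @end quotation).  Field names are illustrative.
  use strict;
  use warnings;
  use JSON::PP;   # core module, used to show a possible JSON dump

  my $quotation = {
    cmdname  => 'quotation',
    # argument appearing on the @quotation line itself
    args     => [ { text => 'Note' } ],
    # elements appearing between @quotation and @end quotation
    contents => [
      { type     => 'paragraph',
        contents => [ { text => 'Some quoted text.' } ] },
    ],
  };

  # A JSON intermediate format would be little more than a
  # serialization of such a structure:
  print JSON::PP->new->pretty->canonical->encode($quotation);

Any general-purpose XML tree module has to be bent a bit to accommodate this args/contents split, which is one reason the existing modules did not fit directly.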
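As a very small illustration of the kind of extraction Bruno has in mind, something along the following lines could list the paragraph units of a Texinfo XML file. This is only a sketch: it assumes the XML::LibXML module is available and that paragraphs show up as <para> elements, which should be checked against the actual Texinfo XML DTD.

  #!/usr/bin/perl
  # Minimal sketch: list the text of paragraph units in a Texinfo XML file.
  # Assumes paragraphs are <para> elements; adjust to the real element
  # names of the Texinfo XML DTD.
  use strict;
  use warnings;
  use XML::LibXML;

  my $file = shift @ARGV or die "usage: $0 file.xml\n";
  my $doc  = XML::LibXML->load_xml(location => $file);

  my $n = 0;
  for my $para ($doc->findnodes('//para')) {
      $n++;
      # textContent flattens the inline command arguments; a real tool
      # would keep the inline markup so the unit can be mapped back.
      printf "unit %d: %s\n", $n, $para->textContent;
  }

A real translation workflow would of course keep the inline markup and the position of each unit, so that the translated units can be put back into the XML and the whole document mapped back to Texinfo.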
A tool or XSLT file to transform Texinfo XML back to Texinfo still needs to be written for the whole process to be doable. This is not a priority for me, but I could do it when I have time (in a few months at the earliest) if nobody else does it first.

A last side effect of the GHM was the creation of the texinfo-devel list for mail exchanges that we previously kept private, mostly between Karl and me. This is, in general, not very interesting, in my opinion, but it allows one to come back and see what changes were discussed, the reasons for the choices, and where the development is heading.

-- Pat
