Re: Changes in xdocs directories
On Wed, 2002-12-11 at 03:11, Peter B. West wrote: I found the image files in .../src/documentation/resources/images/design/alt.design. I'm not seeing any fop-cvs mail about these commits. Any idea why? Don't know why you didn't get it but it was on the list: http://marc.theaimsgroup.com/?l=fop-cvs&m=103883046618393&w=2 Peter - To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, email: [EMAIL PROTECTED]
RE: src/documentation/README
On Tue, 2002-12-10 at 19:17, Victor Mote wrote: OK, I see now that I misunderstood your "Sorry, yes" answer to Keiron. If http://forrestbot.cocoondev.org/site/xml-fop should be reflecting changes no more than an hour old made to xml-fop/src/documentation, then it is not working. I just looked for changes that were made about 24 hours ago, and they are not there. Also, as I mentioned, it shows a publish date of 12-7. So I suspect that something is not working. We have one document with a non-standard DTD (my fault, in fact that is what I am trying to work on) that might be messing up the flow. How do I go about troubleshooting this? Judging by the log it might be a DISPLAY problem. The compliance doc should only cause a broken link. Who is in charge of the cocoondev site? Could they work out what the problem is? I think Sam Ruby has a script which automatically updates the live site to the contents of xml-site/targets/*. Should I contact him directly? Also, I still don't know whether xml-site/targets/fop is my final destination. What is best practice for this process? If this is documented somewhere, please excuse me -- I haven't found it yet. Also, I realize that this conversation might be better on forrest-dev. I asked on this list because I think Keiron has already figured most of this out and I am trying to leverage off of that. Thanks very much for your help. Haven't figured out all of it. There is no real documentation for the old process; currently we are only really replacing the doc generation process.
Re: [ANNOUNCEMENT] FOP 0.20.5 Release Candidate available
Christian Geisert wrote: - Perfo[r]mance tuning Typo in CHANGES: bug number should read 14013, not 14103 (Cocoon bug)!
Re: src/documentation/README
On Tue, Dec 10, 2002 at 11:17:25AM -0700, Victor Mote wrote: ... See http://forrestbot.cocoondev.org/site/xml-fop ... OK, I see now that I misunderstood your "Sorry, yes" answer to Keiron. If http://forrestbot.cocoondev.org/site/xml-fop should be reflecting changes no more than an hour old made to xml-fop/src/documentation, then it is not working. Hmm yes, something went wrong. It seems to be working now, but I'll keep an eye on it. The 'last published' date in the footer is the best indicator of currency. I think Sam Ruby has a script which automatically updates the live site to the contents of xml-site/targets/*. Should I contact him directly? Also, I still don't know whether xml-site/targets/fop is my final destination. What is best practice for this process? If this is documented somewhere, please excuse me -- I haven't found it yet. Also, I realize that this conversation might be better on forrest-dev. I asked on this list because I think Keiron has already figured most of this out and I am trying to leverage off of that. Thanks very much for your help. Forrest is in the same boat as FOP when it comes to site updates. AFAIK, there are no docs, but the process is: - Committers commit generated docs to xml-site/targets/{project} - Every X hours, a script updates /www/xml.apache.org/ or wherever on the live site, from CVS. Pretty messy, but this CVS-based site update system has some virtues: - it is pull-based, so fewer security risks - site contents can be reverted easily without an admin having to figure out how the doc generation tool works. - it's there and it works Discussions for creating a better site update system should probably be held on general@xml, since it affects all projects, or forrest-dev, since people there are particularly interested, and Steven has talked with Sam about it before. --Jeff
cvs commit: xml-fop/src/documentation README
vmote       2002/12/11 11:06:38

  Modified:    src/documentation README
  Log:
  Expand with more details, and add some comments from Jeff Turner.

  Revision  Changes  Path
  1.2       +48 -23  xml-fop/src/documentation/README

  Index: README
  ===================================================================
  RCS file: /home/cvs/xml-fop/src/documentation/README,v
  retrieving revision 1.1
  retrieving revision 1.2
  diff -u -r1.1 -r1.2
  --- README    3 Dec 2002 10:06:10 -0000    1.1
  +++ README    11 Dec 2002 19:06:38 -0000    1.2
  @@ -1,24 +1,49 @@
  -To update the docs:
  -
  -The documentation is generated using forrest (http://xml.apache.org/forrest/).
  -
  -The current procedure is:
  -
  -- checkout xml-forrest module
  -- run: build.sh(bat) dist
  -- follow instructions to set FORREST_HOME and path
  -- go to xml-fop directory
  -- run forrest(.bat)
  -
  -The documents will then be placed in build/site/
  -
  -NOTE: the compliance.html currently does not work, it can be fixed by
  -adding the dtd ref to: build/tmp/context/resources/schema/catalog
  -and placing the dtd in: build/tmp/context/resources/schema/dtd/
  -
  -To update website
  -- put the generated docs into the xml-site module targets/fop/
  -  this could be done by simlinking the destination to the targets/fop
  -- commit the documents
  -
  +To update the FOP website:
  +
  +Background
  +--
  + 1. The documentation is generated using forrest
  +(http://xml.apache.org/forrest/).
  + 2. Forrest needs to be run on a machine with a graphical environment (it will
  +fail in a headless environment when it tries to use FOP to generate the PDF
  +files). The Apache machine available to xml-fop developers
  +(icarus.apache.org) appears to be headless, so you will probably need to run
  +this on a local machine with a graphical environment.
  +
  +Step-by-Step Instructions
  +-
  + 1. checkout the xml-forrest module (same repository as xml-fop).
  + 2. checkout the xml-site/targets/fop module (same repository as xml-fop).
  + 3. you will also need access to a current xml-fop sandbox (you probably already
  +have one)
  + 4. cd to xml-forrest
  + 5. run: build.sh(bat) dist to build forrest
  + 6. set environment variable FORREST_HOME=~/xml-forrest/build/dist/shbat
  +where ~ is the directory in which xml-forrest is installed
  +(see http://xml.apache.org/forrest/your-project.html for details)
  + 7. set environment variable PATH=$PATH:$FORREST_HOME/bin
  + 8. cd to xml-fop directory
  + 9. run forrest(.bat), which will build the web-site documents in
  +xml-fop/build/site.
  +10. NOTE: the compliance.html currently does not work, it can be fixed by
  +adding the dtd ref to: build/tmp/context/resources/schema/catalog
  +and placing the dtd in: build/tmp/context/resources/schema/dtd/
  +11. To update the actual website, copy the generated documents
  +(in xml-fop/build/site) to xml-site/targets/fop. (This could also be done by
  +sym-linking this destination before the build.)
  +12. commit xml-site/targets/fop.
  +
  +Notes
  +-
  + 1. Per Jeff Turner, the downstream process of publishing our web site is as
  +follows:
  +- Committers commit generated docs to xml-site/targets/{project}
  +- Every X hours, a script updates /www/xml.apache.org/ or wherever on
  +  the live site, from CVS.
  + 2. Per Jeff Turner, the FOP website is being regenerated (from the contents
  +of xml-site/targets/fop) by Forrest every hour.
  +See http://forrestbot.cocoondev.org/site/xml-fop for the contents.
  +Although we found this interesting (especially wondering how they got around
  +the headless server problem), it doesn't change our workflow above, because
  +we don't know where, at the filesystem level, these files exist, so we have
  +no way of copying them to xml-site/targets/fop.
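[Editor's note] The README's Unix build steps boil down to roughly the following shell session. This is a sketch only: $WORK and the side-by-side layout of xml-forrest, xml-fop and xml-site are assumptions for illustration, and each step is guarded so it degrades gracefully where the checkouts are absent.

```shell
# Sketch of the README's build steps (Unix variant). $WORK and the
# side-by-side checkout layout are assumptions, not part of the README.
WORK=${WORK:-$HOME}

# steps 4-5: build forrest
cd "$WORK/xml-forrest" 2>/dev/null && sh build.sh dist

# steps 6-7: environment (shbat path per the README)
export FORREST_HOME="$WORK/xml-forrest/build/dist/shbat"
export PATH="$PATH:$FORREST_HOME/bin"

# steps 8-9: generate the site into xml-fop/build/site
cd "$WORK/xml-fop" 2>/dev/null && forrest

# step 11: copy the generated docs into the site checkout
[ -d "$WORK/xml-fop/build/site" ] && cp -r "$WORK/xml-fop/build/site/." "$WORK/xml-site/targets/fop/"
```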
RE: Changes in xdocs directories
Peter B. West wrote: I started to make some changes in the alt.design xdocs directory, and ran into problems immediately. Forrest didn't like something in one of the files, but when I went to have a look at the directory, I began to find CVS discrepancies. This morning I did a cvs update on .../src/documentation/content/xdocs/alt.design, and most of the files had gone away. Is this because of the forrest site build problems? I don't think it is related to the forrest stuff at all. I recommend getting a clean sandbox and seeing if the problem is still there -- I have had trouble with my sandboxes from time to time. Or, at least, go to the repository and see if the files still exist there. I want to make changes to the documentation anyway, so if you let me know what the requirements are for getting the forrest site build to work, I can fix the alt.design files. If you would rather get the basics sorted out yourself, let me know. I just committed some additional information to the README that should help a bit. Please feel free to expand it further, or, if it is sufficient, publish away! Victor Mote
RE: src/documentation/README
Jeff Turner wrote: Forrest is in the same boat as FOP when it comes to site updates. AFAIK, there are no docs, but the process is: - Committers commit generated docs to xml-site/targets/{project} - Every X hours, a script updates /www/xml.apache.org/ or wherever on the live site, from CVS. Pretty messy, but this CVS-based site update system has some virtues: - it is pull-based, so fewer security risks Pull wouldn't require CVS. - site contents can be reverted easily without an admin having to figure out how the doc generation tool works. This makes sense. - it's there and it works Discussions for creating a better site update system should probably be held on general@xml, since it affects all projects, or forrest-dev, since people there are particularly interested, and Steven has talked with Sam about it before. I have no problem with xml-site/targets/{project} or the CVS-based system (except for the binary PDF issue mentioned before, which may be unavoidable). What is a little frustrating is that it seems like we are only a short script (checkout, copy -r, commit) away from being able to use the builds that are being done on icarus (or another apache machine) to update xml-site/targets/{project}, instead of having to do the documentation builds locally. Is there some reason why we can't just do a copy here? Victor Mote
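[Editor's note] The short "checkout, copy -r, commit" script Victor describes might look something like the sketch below. The module paths come from the thread; everything else (flags, commit message) is an assumption, and the cvs steps are left as comments because they require a real checkout and committer access. This is not an actual Apache script.

```shell
#!/bin/sh
# Hypothetical "checkout, copy -r, commit" helper; not an actual Apache
# script. Only the copy step is live here; the cvs steps are comments
# because they need committer access to the repository.
BUILD=${BUILD:-xml-fop/build/site}   # where forrest leaves the generated docs
SITE=${SITE:-xml-site/targets/fop}   # site checkout that gets published

# 1. refresh the site checkout:
#      cvs -q update -d "$SITE"

# 2. copy the freshly generated docs over the site checkout
mkdir -p "$SITE"
if [ -d "$BUILD" ]; then
    cp -r "$BUILD/." "$SITE/"
fi

# 3. commit the result:
#      (cd "$SITE" && cvs commit -m "update generated FOP docs")
```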
RE: dtd catalog (was: src/documentation/README)
Peter B. West wrote: I'm still floundering around here, but when I found 'catalog' in .../documentation/resources/schema, and the dtd in .../schema/dtd, I began to see a ray of light. It seems to me that such a setup should be used for all of the DOCTYPE declarations in the documentation tree. At the moment we are relying on the system identifier component of the DOCTYPE declaration, and that is indicating a CVS retrieval - some from the xml-forrest base, some from xml-cocoon2, last time I looked. The way this works is that your validation software has to know how to find the catalog. If it does, then the catalog can contain mappings from the PUBLIC IDs in the DOCTYPE declaration to a local physical file. That setup is already in all of our documents. For example, our resources.xml file contains the following DOCTYPE: <!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V1.1//EN" "http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-forrest/src/resources/schema/dtd/document-v11.dtd"> Your catalog has to know how to map "-//APACHE//DTD Documentation V1.1//EN" to /u/xml-schema/document-v11.dtd or whatever your local file is. What I added several weeks ago was the URIs (yes, they are CVS-based until we find some static URI to use instead) that allow the validation to be done across the internet. This doesn't take away the ability to use a local catalog, but rather makes it no longer a necessity. My understanding is that using URIs is the preferable way to do the validation. O'Reilly's XML in a Nutshell, 2nd edition, page 32, says: "In practice, however, PUBLIC IDs aren't used very much. Almost all validators rely on the URI to actually validate the document." The only reason to use the catalog and the PUBLIC ID is if you are on a machine that doesn't have suitable net access. These can (probably will) get out of sync. The dtd should be the one used when the document was last modified, shouldn't it?
It seems to me there is a case for including the schema subtree, including catalog file(s) and the dtd subdirectory, in the src/build tree, and maintaining the synchronization locally. Keiron and I discussed this at the time: http://marc.theaimsgroup.com/?l=fop-dev&m=103726556406364&w=2 http://marc.theaimsgroup.com/?l=fop-dev&m=103726721907599&w=2 http://marc.theaimsgroup.com/?l=fop-dev&m=103730698919444&w=2 and decided against it. Since I did this work, I see that we could use viewcvs to get a specific revision of the file, so we could control this using that method. However, it seems to me that DTDs conventionally have a version number built into their filenames, so I assume that any changes made on those files are of a bug-fix nature as opposed to radical changes that would be likely to mess up users of the DTD. It seems to me that we have this set up as well as it can be, but I sure could be missing something. Victor Mote
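[Editor's note] For illustration, an OASIS TR9401-style catalog entry mapping the PUBLIC ID discussed in this thread to a local copy of the DTD might look like this. The local path is an assumption, not a file from the FOP tree:

```
-- hypothetical catalog file: maps the Apache document DTD's PUBLIC ID --
-- to a local copy so validation works without net access              --
PUBLIC "-//APACHE//DTD Documentation V1.1//EN" "dtd/document-v11.dtd"
```

A validator configured to use this catalog resolves the PUBLIC ID locally and never fetches the CVS-based URI in the SYSTEM identifier.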
cvs commit: xml-fop/lib BSF.license.txt bsf.jar
olegt       2002/12/11 14:01:55

  Removed:     lib BSF.license.txt bsf.jar
  Log:
  Removed bsf.jar and its license as unused anymore.
cvs commit: xml-fop/docs/examples runtests.bat runtests.sh
olegt       2002/12/11 14:13:21

  Modified:    .             fop.bat
               docs/examples runtests.bat runtests.sh
  Log:
  Removed footprints of bsf.jar.

  Revision  Changes  Path
  1.8       +0 -1    xml-fop/fop.bat

  Index: fop.bat
  ===================================================================
  RCS file: /home/cvs/xml-fop/fop.bat,v
  retrieving revision 1.7
  retrieving revision 1.8
  diff -u -r1.7 -r1.8
  --- fop.bat    22 Nov 2002 18:07:13 -0000    1.7
  +++ fop.bat    11 Dec 2002 22:13:20 -0000    1.8
  @@ -7,7 +7,6 @@
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\xalan-2.4.1.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\batik.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\avalon-framework-cvs-20020806.jar
  -set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\bsf.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\jimi-1.0.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\jai_core.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\jai_codec.jar

  1.13      +0 -1    xml-fop/docs/examples/runtests.bat

  Index: runtests.bat
  ===================================================================
  RCS file: /home/cvs/xml-fop/docs/examples/runtests.bat,v
  retrieving revision 1.12
  retrieving revision 1.13
  diff -u -r1.12 -r1.13
  --- runtests.bat    22 Nov 2002 18:06:45 -0000    1.12
  +++ runtests.bat    11 Dec 2002 22:13:21 -0000    1.13
  @@ -14,7 +14,6 @@
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\xalan-2.4.1.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\batik.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\avalon-framework-cvs-20020806.jar
  -set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\bsf.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\jimi-1.0.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\jai_core.jar
   set LOCALCLASSPATH=%LOCALCLASSPATH%;%LIBDIR%\jai_codec.jar

  1.10      +0 -1    xml-fop/docs/examples/runtests.sh

  Index: runtests.sh
  ===================================================================
  RCS file: /home/cvs/xml-fop/docs/examples/runtests.sh,v
  retrieving revision 1.9
  retrieving revision 1.10
  diff -u -r1.9 -r1.10
  --- runtests.sh    22 Nov 2002 18:06:45 -0000    1.9
  +++ runtests.sh    11 Dec 2002 22:13:21 -0000    1.10
  @@ -20,7 +20,6 @@
   LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/xalan-2.4.1.jar
   LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/batik.jar
   LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/avalon-framework-cvs-20020806.jar
  -LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/bsf.jar
   LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/jimi-1.0.jar
   LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/jai_core.jar
   LOCALCLASSPATH=$LOCALCLASSPATH:$LIBDIR/jai_codec.jar
Re: dtd catalog (was: src/documentation/README)
Victor Mote wrote: Peter B. West wrote: I'm still floundering around here, but when I found 'catalog' in .../documentation/resources/schema, and the dtd in .../schema/dtd, I began to see a ray of light. It seems to me that such a setup should be used for all of the DOCTYPE declarations in the documentation tree. At the moment we are relying on the system identifier component of the DOCTYPE declaration, and that is indicating a CVS retrieval - some from the xml-forrest base, some from xml-cocoon2, last time I looked. The way this works is that your validation software has to know how to find the catalog. If it does, then the catalog can contain mappings from the PUBLIC IDs in the DOCTYPE declaration to a local physical file. That setup is already in all of our documents. For example, our resources.xml file contains the following DOCTYPE: <!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V1.1//EN" "http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-forrest/src/resources/schema/dtd/document-v11.dtd"> Your catalog has to know how to map "-//APACHE//DTD Documentation V1.1//EN" to /u/xml-schema/document-v11.dtd or whatever your local file is. What I added several weeks ago was the URIs (yes, they are CVS-based until we find some static URI to use instead) that allow the validation to be done across the internet. This doesn't take away the ability to use a local catalog, but rather makes it no longer a necessity. My understanding is that using URIs is the preferable way to do the validation. O'Reilly's XML in a Nutshell, 2nd edition, page 32, says: "In practice, however, PUBLIC IDs aren't used very much. Almost all validators rely on the URI to actually validate the document." The only reason to use the catalog and the PUBLIC ID is if you are on a machine that doesn't have suitable net access. Norm Walsh has been campaigning for catalogs for a while now. E.g. http://wwws.sun.com/software/xml/developers/resolver/article/. I don't have the luxury of permanent net access.
Neither do you if cvs.apache.org is down or your access to the net is cut for any reason. I think many apache developers would be in the situation that their work on open source must take place _away_ from their employment environment, and many of these would not have broadband or other permanent access. I think that factor is worth considering. Peter -- Peter B. West [EMAIL PROTECTED] http://www.powerup.com.au/~pbwest/ Lord, to whom shall we go?
Today's the day...
Fopsters, Today (it's Thursday, my time) we hear what Tony Graham has been up to. I don't suppose anyone is in Baltimore? Peter -- Peter B. West [EMAIL PROTECTED] http://www.powerup.com.au/~pbwest/ Lord, to whom shall we go?
Re: Alt-Design: Preliminary results FO tree build test
Keiron Liddle wrote: The only questions I have at the moment: - are markers handled properly? You mentioned something about that earlier, so maybe it is dealt with already. - what about arbitrary xml anywhere for extensions, is that still possible (also instream-foreign-object, but that is probably okay)? I know it is not a spec thing but it can enhance using FOP for many users. Marker handling is deferred until area tree construction. Not all of the FOs that can have markers have been fitted with handling yet, but the model for it at the FO tree building level is as follows (from FoListBlock.java). The table-row comment, which I just noticed, is a hang-over from another FO.

    /** The number of markers on this FO. */
    private int numMarkers = 0;
    /** The number of list-items on this FO. */
    private int numItems = 0;
    /** The offset of 1st table-row within the children. */
    private int firstItemOffset = -1;

    while ((ev = xmlevents.expectStartElement
                    (FObjectNames.MARKER, XMLEvent.DISCARD_W_SPACE)) != null) {
        new FoMarker(getFOTree(), this, ev, stateFlags);
        numMarkers++;
        ev = xmlevents.getEndElement(xmlevents.DISCARD_EV, ev);
        pool.surrenderEvent(ev);
    }

There is no provision for extension elements, apart from the keeping track of incoming elements with variant namespace declarations. In terms of the inherent input validation of pull parsing, the checking of foreign namespace elements could be inserted in the get/expect processing of the FOs. The FO generation is already generalised (most Fo? elements are not named in the code, and are generated by the makeFlowObject() method of FObjects), so generalising the validation of foreign elements should be feasible. The semantics of such objects is always going to be more of a problem. Peter -- Peter B. West [EMAIL PROTECTED] http://www.powerup.com.au/~pbwest/ Lord, to whom shall we go?
Re: Getting breaks: revisited
J.Pietschmann wrote: Peter B. West wrote: ...the intention of the spec would be realised by laying out 0 of the repeatable-p-m-refs thin, out of the available range of 0-100, then laying out 1 of the thick r-p-m-refs. Interesting and useful interpretation. The problem is, how to implement this? Joerg, It depends on your overall method of generating areas. This goes to overall design questions, which are on hold until we have had a chance to consider the Sun product. But if layout is driven from below (as in a sense it already is), with tentative layouts bubbling upwards until they strike an invalidating constraint, which then follows them back down the tree for a retry, and all of this eventually comes back up to the PageMaker level, then, in the case above, the result would be a no-can-do, accompanied by the dimensions of the best attempt. The PageMaker would then look for a layout master alternative, discarding the remainder of the repetitions in the process, and having a go at the thick master. That succeeds. If it didn't, the fallback mode would be determined by the overflow property. Peter -- Peter B. West [EMAIL PROTECTED] http://www.powerup.com.au/~pbwest/ Lord, to whom shall we go?
Re: dtd catalog (was: src/documentation/README)
On Wed, Dec 11, 2002 at 12:59:03PM -0700, Victor Mote wrote: Peter B. West wrote: I'm still floundering around here, but when I found 'catalog' in .../documentation/resources/schema, and the dtd in .../schema/dtd, I began to see a ray of light. It seems to me that such a setup should be used for all of the DOCTYPE declarations in the documentation tree. At the moment we are relying on the system identifier component of the DOCTYPE declaration, and that is indicating a CVS retrieval - some from the xml-forrest base, some from xml-cocoon2, last time I looked. The way this works is that your validation software has to know how to find the catalog. If it does, then the catalog can contain mappings from the PUBLIC IDs in the DOCTYPE declaration to a local physical file. That setup is already in all of our documents. For example, our resources.xml file contains the following DOCTYPE: <!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V1.1//EN" "http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-forrest/src/resources/schema/dtd/document-v11.dtd"> Your catalog has to know how to map "-//APACHE//DTD Documentation V1.1//EN" to /u/xml-schema/document-v11.dtd or whatever your local file is. FTR, Forrest has a built-in catalog, so the SYSTEM id of docs processed by Forrest will be ignored. If FOP has any DTDs not included in Forrest, then these can be added to src/documentation/resources/schema/dtd, and a project-specific catalog added. This process is documented at http://xml.apache.org/forrest/validation.html It seems to me that we have this set up as well as it can be, but I sure could be missing something. Looks like it. --Jeff
Automatic website update
Hi all, Sam Ruby has added the FOP website to his script which updates daedalus (xml.apache.org) from icarus (cvs.apache.org) every 6 hours, starting at midnight Pacific Time (where daedalus is hosted). Christian
Re: src/documentation/README
On Wed, Dec 11, 2002 at 03:15:33PM +0100, Nicola Ken Barozzi wrote: ... [On why http://forrestbot.cocoondev.org/sites/xml-fop/ wasn't updating] ... [java] Exception in thread "main" java.lang.InternalError: Can't connect to X11 window server using ':0.0' as the value of the DISPLAY variable. I added a DISPLAY pointing to Xvfb, and it seems to be updating properly now. If anyone notices a 'Last published' date at the bottom of a page older than 1 hour (GMT) please let me know. --Jeff
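[Editor's note] For anyone hitting the same headless-server failure, the workaround Jeff describes amounts to running a virtual framebuffer and pointing DISPLAY at it. A sketch; the display number ":1" and screen geometry are arbitrary choices, not values from the thread:

```shell
# Workaround sketch for the X11/DISPLAY error above: start Xvfb (a virtual
# X server) if available, and point DISPLAY at it. ":1" is arbitrary.
command -v Xvfb >/dev/null && Xvfb :1 -screen 0 1024x768x8 &
export DISPLAY=:1
# ... now run the forrest build in this environment ...
```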
Croatian characters in fop created pdf
Hi, I have created a Java class that takes an xml file and an xsl file and creates a pdf. Now my xml has Croatian characters and those are shown as #. I read the documentation and it says that fonts must be available on the platform. Now I don't understand this, since I have the Croatian locale set and I can view pdf files that have Croatian characters. I have tried to use different font families for the fo:block tag (Arial, Symbol etc ...). What should I do to make it work? If you can explain it to me step by step I would be very thankful. My method is

    public void createPdf(String xml, String xslPath, String outputPdfPath)
            throws TransformerConfigurationException, TransformerException,
                   FileNotFoundException, FOPException, IOException {
        Logger log = new ConsoleLogger(ConsoleLogger.LEVEL_WARN);
        MessageHandler.setScreenLogger(log);
        FileOutputStream fos = new FileOutputStream(outputPdfPath);
        //Options options = new Options(new File("c:\\Adis\\MedicSoft\\MedicsoftClient\\GUI\\config\\print\\userconfig.xml"));
        Driver driver = new Driver();
        driver.setLogger(log);
        driver.setOutputStream(fos);
        driver.setRenderer(Driver.RENDER_PDF);
        StreamSource xmlStreamSource = new StreamSource(new StringReader(xml));
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(xslPath));
        transformer.transform(xmlStreamSource,
                new SAXResult(driver.getContentHandler()));
        fos.close();
    }

and the xsl that is used for fop is very simple:

    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:fo="http://www.w3.org/1999/XSL/Format">
      <xsl:template match="/">
        <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
          <fo:layout-master-set>
            <fo:simple-page-master master-name="simple"
                page-height="29.7cm" page-width="21cm"
                margin-top="1cm" margin-bottom="2cm"
                margin-left="2.5cm" margin-right="2.5cm">
              <fo:region-body margin-top="3cm"/>
              <fo:region-before extent="3cm"/>
              <fo:region-after extent="1.5cm"/>
            </fo:simple-page-master>
          </fo:layout-master-set>
          <fo:page-sequence master-reference="simple">
            <fo:flow flow-name="xsl-region-body">
              <xsl:apply-templates select="data"/>
            </fo:flow>
          </fo:page-sequence>
        </fo:root>
      </xsl:template>
      <xsl:template match="data">
        <fo:block>
          <xsl:apply-templates select="name"/>
          <xsl:apply-templates select="description"/>
        </fo:block>
      </xsl:template>
      <xsl:template match="name">
        <fo:block font-size="18pt" font-family="sans-serif" line-height="24pt"
            space-after.optimum="15pt" background-color="blue" color="white"
            text-align="center" padding-top="3pt">
          <xsl:value-of select="."/>
        </fo:block>
      </xsl:template>
      <xsl:template match="description">
        <fo:block font-size="12pt" font-family="Symbol" line-height="15pt"
            space-after.optimum="3pt" text-align="justify">
          <xsl:value-of select="."/>
        </fo:block>
      </xsl:template>
    </xsl:stylesheet>
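[Editor's note] The usual cause of "#" glyphs in FOP 0.20.x output is that the built-in base-14 fonts lack the needed characters; the standard remedy is to generate font metrics with TTFReader and register an embedded Unicode TrueType font in userconfig.xml, then reference that family from the stylesheet. A sketch only; the metrics and font file names below are assumptions, not files from this thread:

```xml
<!-- Hypothetical userconfig.xml fragment: file names are examples only.
     Metrics are generated beforehand with org.apache.fop.fonts.apps.TTFReader. -->
<configuration>
  <fonts>
    <font metrics-file="arial.xml" kerning="yes"
          embed-file="C:\WINNT\Fonts\arial.ttf">
      <font-triplet name="Arial" style="normal" weight="normal"/>
      <font-triplet name="Arial" style="normal" weight="bold"/>
    </font>
  </fonts>
</configuration>
```

The commented-out Options line in the createPdf method above is where such a file would be loaded; the fo:block elements would then use font-family="Arial", and the XML source must actually be UTF-8 encoded.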