Servus Volker,

the cdbwiki (mw-buildcdb) contains all articles; mw-render just doesn't
include them. environment.wiki.metabook is empty from beginning to end.
Can I specify a metabook that pulls in all articles? What is a metabook,
and what should it look like?

Then I have one more question. Is it possible to rename or translate
articles by specifying a metabook?
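
From skimming mwlib's metabook module I would guess that a metabook is
just a JSON description of the collection to render, roughly like the
sketch below. The exact keys are assumptions on my part -- in particular
whether a "displaytitle" entry is the right way to rename or translate an
article -- so please correct me:

```python
# Hypothetical sketch of a minimal metabook, based on my reading of
# mwlib.metabook; the exact keys (especially "displaytitle") are guesses.
import json

metabook = {
    "type": "collection",
    "version": 1,
    "title": "My Collection",
    "items": [
        # One entry per article that should be rendered.
        {"type": "article", "title": "First Article"},
        # "displaytitle" would show a different (e.g. translated) name
        # without renaming the article in the wiki itself.
        {"type": "article", "title": "Zweiter Artikel",
         "displaytitle": "Second Article"},
    ],
}

with open("metabook.json", "w") as f:
    json.dump(metabook, f, indent=2)
```

If that is roughly right, I assume I could pass the file to mw-render
with its -m/--metabook option.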


Best wishes,

Timo


On Thursday, 30.06.2011, at 11:31 +0200, Volker Haas wrote:
> I don't really know anything about cdbwiki - it might be completely 
> broken and no longer compatible with the current writers (pdf etc.). I 
> suspect that one of two things might have happened: "importing" the 
> xmldump failed and the cdbwiki is essentially empty. The other option is 
> that iterating over the contained articles failed, therefore it only 
> seems empty.
> 
> Sorry, I have no idea what is wrong or how to help,
> Volker
> 
> On 06/17/2011 02:42 PM, Timo wrote:
> > No, they are not. rl outputs a PDF with one page that says "Article
> > Sources and contributors".
> >
> >
> > On Friday, 17.06.2011, at 10:05 +0200, Volker Haas wrote:
> >> Can you please check whether all articles are rendered if you export to
> >> PDF (writer: rl)?
> >>
> >> On 06/16/2011 11:58 AM, Timo wrote:
> >>> When I use mw-render -w xhtml the same problem occurs: it doesn't output
> >>> any articles, only the following. Is there any writer which can output
> >>> all articles?
> >>>
> >>> <?xml version="1.0" encoding="UTF-8"?>
> >>> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
> >>> "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
> >>> <html xml:lang="en"
> >>> xmlns="http://www.w3.org/1999/xhtml"><head><title /><style
> >>> media="screen, projection" type="text/css">
> >>>                           @import
> >>> "http://en.wikipedia.org/skins-1.5/common/shared.css?156";
> >>>                           @import
> >>> "http://en.wikipedia.org//skins-1.5/monobook/main.css?156";
> >>> </style></head><body><div class="mwx.collection" /></body></html>
> >>>
> >>>
> >>> On Wednesday, 15.06.2011, at 23:56 +0200, Timo wrote:
> >>>> Hi Volker,
> >>>>
> >>>> I am parsing an XML dump. This is being treated as nucdb (wiki.py line
> >>>> 25). I want to export to docbook. This doesn't pick up all articles,
> >>>> although it should. Which output formats are supported? I am considering
> >>>> extending mwlib.
> >>>>
> >>>>
> >>>> Regards,
> >>>>
> >>>> Timo
> >>>>
> >>>> On Tuesday, 14.06.2011, at 15:00 +0200, Volker Haas wrote:
> >>>>> On 06/10/2011 04:04 PM, Timo wrote:
> >>>>>> Dear everybody,
> >>>>>>
> >>>>>> who wrote nuwiki.py?
> >>>>> numerous people:
> >>>>> http://code.pediapress.com/git/mwlib?p=mwlib;a=history;f=mwlib/nuwiki.py;h=1aeeca3e0076becb089bab06f2fec8d57c7041d4;hb=HEAD
> >>>>>
> >>>>>> In lines 39-45 the program tries to load
> >>>>>> information from JSON files. I don't have those for my wiki.
> >>>>> Those JSON files are usually never exposed to the "user". And if I am
> >>>>> not totally mistaken, the user should probably never poke around in 
> >>>>> nuwiki.py.
> >>>>>
> >>>>> Depending on the use case you might want to create a wiki object from a
> >>>>> zip file (I am totally guessing here). This would work like so:
> >>>>>
> >>>>> from mwlib import wiki
> >>>>> w = wiki.makewiki('/path/to/zip/file.zip')
> >>>>>
> >>>>>> Should any of these JSON files contain information about the
> >>>>>> article names? When I don't call mw-render with article names, the
> >>>>>> output will be empty. Maybe somebody can help me find the reason? :)
> >>>>>>
> >>>>> I don't really understand what you are trying to do. But from earlier
> >>>>> posts I infer that some of the assumptions below might be true:
> >>>>>
> >>>>> a) you are using mwlib on Windows
> >>>>> b) you are trying to use cdb files as input to mw-render
> >>>>> c) you are trying to export to docbook
> >>>>>
> >>>>> a) is not supported, and b) and c) aren't actively maintained anymore.
> >>>>> Therefore I cannot really help you much at all, sorry.
> >>>>>
> >>>>> Regards,
> >>>>> Volker
> >>>>>
> >>>>>
> >>>>> -- 
> >>>>> volker haas                 brainbot technologies ag
> >>>>> fon +49 6131 2116394        boppstraße 64
> >>>>> fax +49 6131 2116392        55118 mainz
> >>>>> [email protected]    http://www.brainbot.com/
> >>>>>
> >
> 


-- 
You received this message because you are subscribed to the Google Groups 
"mwlib" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/mwlib?hl=en.
