Hi
it's on Labs now.
http://mediawiki2latex.wmflabs.org/
The only problem is that only about five people a day are using it. That's
good on the one hand, since the server is small. On the other hand, there
might be some people who would like to use it but still don't find it.
So if anybody got an
On 12/11/2013 01:36 AM, C. Scott Ananian wrote:
Could you take a look at the attached PDF, generated from
https://ml.wikipedia.org/wiki/%E0%B4%AE%E0%B4%B2%E0%B4%AF%E0%B4%BE%E0%B4%B3%E0%B4%82
with our not-yet-deployed new software? Any Malayalam-specific feedback
you could provide would be very
Sure, I'd love to look at your code. Hopefully we can avoid
reinventing the wheel *too* many times. Is it available somewhere?
Or a written report?
--scott
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
On Mon, Nov 25, 2013 at 10:49 PM, Santhosh Thottingal
santhosh.thottin...@gmail.com wrote:
To support complex
scripts (https://en.wikipedia.org/wiki/Complex_text_layout) we
need to use a TeX system that can support Unicode and complex script
rendering. XeTeX
On Tue, Dec 10, 2013 at 3:06 PM, C. Scott Ananian
canan...@wikimedia.org wrote:
Could you take a look at the attached PDF, generated from
https://ml.wikipedia.org/wiki/%E0%B4%AE%E0%B4%B2%E0%B4%AF%E0%B4%BE%E0%B4%B3%E0%B4%82
with our not-yet-deployed new software? Any Malayalam-specific feedback
Hi Scott,
I saw you started to work on a LaTeX export yourself. I needed more
than three years for mine. So I want you to be aware that it might take you a
long time to come up with something that really works. I also want to
offer to share all my experience with you if you decide to do it
The new PDF rendering pipeline does indeed use XeLaTeX. I haven't
used it to typeset non-Latin scripts since a summer I spent at SIL in
1996 (and that might have been Omega, not XeLaTeX), so if you wanted
to pitch in and help out I'd greatly appreciate it. To start with,
short example LaTeX
The new PDF rendering pipeline includes a new wikitext-to-LaTeX
converter, based on the Parsoid parser. You might want to check out:
https://git.wikimedia.org/summary/mediawiki%2Fextensions%2FCollection%2FOfflineContentGenerator%2Fbundler
and
To support complex
scripts (https://en.wikipedia.org/wiki/Complex_text_layout) we
need to use a TeX system that can support Unicode and complex script
rendering. XeTeX (https://en.wikipedia.org/wiki/XeTeX) works very
well with these scripts. I tried the MediaWiki to LaTeX converter with
Malayalam
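A minimal XeLaTeX sketch of the kind of setup being discussed might look like the following; the font name "Meera" is an assumption, and any installed Malayalam font with complex-script shaping support would do:

```latex
% Minimal XeLaTeX sketch -- compile with: xelatex example.tex
% The font name "Meera" is an assumption; substitute any installed
% Malayalam font that supports complex-script shaping.
\documentclass{article}
\usepackage{fontspec}
\newfontfamily\malayalamfont[Script=Malayalam]{Meera}
\begin{document}
{\malayalamfont മലയാളം} renders correctly only when the engine
supports complex text layout.
\end{document}
```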
Why not set it up on Labs? :)
On 17 November 2013 20:45, Dirk Hünniger dirk.hunni...@googlemail.com wrote:
Hello,
I also put up a web version of the mediawiki to latex converter.
http://mediawiki2latex.mooo.com/
The machine it is running on is really slow (like an Intel Atom).
Yours Dirk
On 12.11.2013 13:09, Fred Bauder wrote:
I have a log of what happens when the commands:
sudo apt-get install mediawiki2latex
mediawiki2latex -u https://en.wikipedia.org/wiki/Adam_Ries -o AdamRies.pdf
are entered on the command line of Ubuntu (13.10). Better than TV...
Happy to send it to anyone.
Fred
I'm not on Ubuntu 13.10 (still running the 12.04 LTS); but I tried
downloading and installing the cabal manually just to try it out. I'm
getting the following error from installing the cabal in
mediawiki2latex-7.1.tar.gz; any thoughts? (Also for some reason the build
tells me that it's
Hello,
I downloaded from
http://sourceforge.net/projects/wb2pdf/files/mediawiki2latex/7.1/
and simply ran make. This did not trigger cabal.
You also have to install the build dependencies before you compile:
Build-Depends: debhelper (>= 8), ghc,
libghc-regex-compat-dev, libghc-http-dev,
Hello,
in the current version of Ubuntu (13.10) you can compile pages from
MediaWiki to LaTeX using the commands:
sudo apt-get install mediawiki2latex
mediawiki2latex -u https://en.wikipedia.org/wiki/Adam_Ries -o AdamRies.pdf
Yours Dirk
Hi! Well, I was thinking that maybe both formats could coexist in the same
project/website, and as soon as somebody wants to export it, it would be
possible to convert the wikitext into LaTeX and export both.
No idea if it would have any practical use, though.
Btw, great music! :)
Micru
On 2013-08-04 at 10:05:35 +0200, Dirk Hünniger wrote:
Hello,
I made a new debian package, which resolves the security issues you
mentioned.
It is available here:
http://sourceforge.net/projects/wb2pdf/files/mediawiki2latex/6.5/
Yours Dirk
I had to use a mailing list archive to see
I think this is going to be great news for WorkingWiki
https://www.mediawiki.org/wiki/Extension:WorkingWiki
Cheers,
Micru
On Tue, Aug 13, 2013 at 3:57 PM, Greg Grossmeier g...@wikimedia.org wrote:
On 2013-08-04 at 10:05:35 +0200, Dirk Hünniger wrote:
Hello,
I made a new debian
Hi Micru!
What use case do you have in mind?
The main use of LaTeX in WorkingWiki is simply authoring .tex files
directly...
Lee Worden
P.S. I meant to post the "What is WorkingWiki" video here that I put on
Twitter during Wikimania: http://www.youtube.com/watch?v=D8pwr-Uizf4
You mean the mega font? That's actually 207M uncompressed :)
That should probably go to a different package (and depend on it). I
don't see why it couldn't fall back to another available font if it's not
available, though.
I could indeed work without that font. But in this case I will create
font
The point is that the change of the font has to happen inside a run of
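The fallback idea under discussion could be sketched with fontspec's \IfFontExistsTF; the font names here are assumptions, and (as noted above) changing the font inside a run is a separate problem:

```latex
% Hedged sketch: pick a preferred font if installed, else fall back.
% Font names are assumptions; \IfFontExistsTF is provided by fontspec.
\documentclass{article}
\usepackage{fontspec}
\IfFontExistsTF{FreeSerif}
  {\setmainfont{FreeSerif}}            % preferred (large) font, if present
  {\setmainfont{Latin Modern Roman}}   % fallback when it is not installed
\begin{document}
This document compiles with whichever font was found.
\end{document}
```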
Hugo Vincent hugo at bluewatersys.com writes:
Hi everyone,
I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution but nothing
On 16 June 2012 10:51, Dirk Hünniger hunni...@cip.physik.uni-bonn.de wrote:
This problem is actually solved; there is an easy way to export MediaWiki
articles to LaTeX and PDF.
see http://de.wikibooks.org/wiki/Benutzer:Dirk_Huenniger/wb2pdf
On 06/16/2012 12:03 PM, Svip wrote:
Interesting, but why is it so large? Is the source code available?
The source code is available here
http://wb2pdf.svn.sourceforge.net/viewvc/wb2pdf/
The binary is large because it contains everything necessary to compile
the generated LaTeX code, which
On 16/06/12 10:51, Dirk Hünniger wrote:
This problem is actually solved; there is an easy way to export MediaWiki
articles to LaTeX and PDF.
see http://de.wikibooks.org/wiki/Benutzer:Dirk_Huenniger/wb2pdf
Yours Dirk Hünniger
How does it compare with
On 16/06/12 12:25, Dirk Hünniger wrote:
On 06/16/2012 12:03 PM, Svip wrote:
Interesting, but why is it so large? Is the source code available?
The source code is available here
http://wb2pdf.svn.sourceforge.net/viewvc/wb2pdf/
The binary is large because it contains everything necessary
On 06/16/2012 06:49 PM, Platonides wrote:
Have you heard of dependencies? You have to download a 364M file, which extracts
to 898M. Of those, 94M are Linux-specific. The rest includes MiKTeX files,
object files, DLLs, EXEs, ImageMagick, Tcl/Tk, the Olson db... The real code seems to
lie at