On Monday, 28.05.2007, at 19:32 +1000, John Beckett wrote:
> Sebastian Menge wrote:
> > Find the list (95 entries) here:
> > http://scratchpad.wikia.com/wiki/Errornames
> 
> Thanks for the good start.
> FYI there are a couple of lines with broken links:
> 157 160: 171: Do you know the "g/" and "g?" commands?

That is fixed now. It was a problem with the script that produced the page.

> Yes! In option (b), you have to change every '/' to '__OR__', so
> you may as well change the titles to something good now.

That could be done with a regex.
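
A minimal sketch of that substitution, assuming the titles sit in a
plain text file with one title per line (the file name is only an
example):

    import re

    # read the list of titles, one per line
    with open("Errornames.txt") as f:
        titles = f.read()

    # replace every '/' with '__OR__' so the title is a legal page name
    titles = re.sub(r"/", "__OR__", titles)

    with open("Errornames.txt", "w") as f:
        f.write(titles)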

> Can you readily do something like this: Put each tip in a
> separate file on your disk. Name them tip0001, tip0002, etc.
> 
> Put the list of 1500 tip titles in one file, one title per
> line. Then edit that file to clean up the titles. Then run a
> script to rename each tip to match the cleaned-up title.

One idea was that the editing could be done directly on the wiki: just
edit the Errornames page :-)
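
In case we do go the offline route, here is a rough sketch of the
rename step you describe (the file names and the one-title-per-line
layout are assumptions, not something I have set up yet):

    import os

    # cleaned-up titles, one per line, in the same order as the tips
    with open("titles.txt") as f:
        titles = [line.strip() for line in f]

    for i, title in enumerate(titles):
        old = "tip%04d" % (i + 1)
        new = title.replace("/", "__OR__")   # keep the name wiki-safe
        if os.path.exists(old):
            os.rename(old, new)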

> > or
> > b) replace '/' by sth like '__OR__' and fix the whole
> >    title later?
> 
> Whatever works, but wouldn't this create a whole bunch of
> problems? I don't understand the internals of wikis but I think
> your suggestion would create 95 tips with URLs that will later
> need to be manually edited. Not so easy, and probably involves
> copying the content from the wiki to a new page, then deleting
> the old page (I guess).

Moving/Renaming is easy in wikis.

> > BTW: For the import I will now use WikipediaFS.
> 
> Wow - amazing.
> 
> How do you get the wiki format files from the VimTips web site?
> If you're going to do the work, I don't need this answered, but
> I'm thinking that you're going to need one of the scripts to
> convert the existing html to wiki format.
> 
> I noticed that Charles Campbell's script does appropriate things
> with common html codes like nonbreaking space. Probably all that
> processing should be done before the files are uploaded?
> 
> With your scheme, you're going to get 1500 tip files on your
> disk. It would be great if you could clean them as much as
> possible before uploading. It would be pretty easy to find all
> html markups and '&' codes when the files are still on your
> disk.

The cool thing about WikipediaFS is that I can do scripting on the
pages _as if_ they were on my disk!
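
A minimal sketch of what that could look like, assuming the wiki is
mounted at ~/wikifs/vim (the mount point and page layout are
assumptions about my setup, not fixed yet):

    import glob
    import os

    mount = os.path.expanduser("~/wikifs/vim")   # assumed mount point

    # every "file" here is really a wiki page; writing it commits an edit
    for path in glob.glob(os.path.join(mount, "*")):
        with open(path) as f:
            text = f.read()
        text = text.replace("&nbsp;", " ")
        with open(path, "w") as f:
            f.write(text)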

My general approach:

First I downloaded all the tips to my machine. I started with the script
vimtips.py and hacked it a lot. The script uses a Perl module that
converts HTML to wiki markup. But I still have problems with some
regexes: regexes that stay stable across 1500 pages are not easy to
write. I have to clean up the script and will post it here, because I
guess there are some regex gurus around ...
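
To give an idea of the kind of cleanup I mean, here is a sketch of the
entity and tag handling (the entity table and patterns are only
examples; the real script still needs work):

    import re

    ENTITIES = {"&nbsp;": " ", "&amp;": "&", "&lt;": "<",
                "&gt;": ">", "&quot;": '"'}

    def clean(text):
        # translate the common '&' codes John mentioned
        for ent, repl in ENTITIES.items():
            text = text.replace(ent, repl)
        # drop simple leftover tags like <br> or <p>
        text = re.sub(r"(?i)</?(br|p)\s*/?>", "\n", text)
        return text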

Thanks, Sebastian.
