Jacob Singh wrote:

> What is the common framework people use for I18N on your sites?  John
> Coggeshall has an article in PHPBuilder about using Smarty filters. I
> don't really approve of this approach because it forces me into
> Smarty, which I am not particularly fond of.
> 
> I like the look of PEAR::Translation2, but I am not sure about the best
> way to implement it.  I feel that a good I18N package, like any other
> package, doesn't compromise your framework intentions.  This one seems
> to require that you use PEAR::DB through their connection, which is a
> problem because of connection pooling, and the fact that I don't use
> PEAR::DB, I am using Propel.
> 
> Any thoughts on this?  I need to make a site that is UTF-8 and has
> translations not only for labels and images, but in many cases for
> actual data.
> 
I looked at several different ways of doing things; PHP's Gettext extension,
PEAR's Translation and Translation2, and IntSmarty.

After using Smarty in a fairly large project, I will never use it again, so
IntSmarty was out.

PHP's gettext was my next choice - it's the fastest - but it didn't work out
for a few reasons.

- Hard to search for translated strings
- Hard to set up
- Not easy to use with a site that needs to store content in a database and
translate it as well

PEAR's File_Gettext package helps a bit, but I went with Translation2 in the
end. It's reasonably fast, and client-side caching helps a lot in this
regard. I was using PEAR::DB already, so I went with that as a container.

Translation2 doesn't require that you use PEAR::DB. It has a number of
containers, and you can pretty easily write your own to wrap your DB layer
if you like.

So, to summarize, I'm using:

- PEAR::DB (with PostgreSQL)
- PEAR::Translation2
- PEAR::I18Nv2
- PEAR::HTTP / HTTP_Header / HTTP_Header_Cache

I'm using UTF-8.

I'm using the HTTP packages for cache control and for HTTP's
negotiateLanguage() method. I18Nv2 has a Negotiator class, but I was already
using PEAR::HTTP, so I went with its method since it meant less code to load
and parse.
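
For reference, this is roughly what the negotiation boils down to. Below is a
plain-PHP sketch of matching an Accept-Language header against a supported
list; PEAR::HTTP's negotiateLanguage() does all of this (and more) for you,
and the function name and language lists here are made up:

```php
<?php
// Plain-PHP sketch of what language negotiation boils down to.
// PEAR::HTTP's negotiateLanguage() does this (and more) for you;
// the function name and language lists here are made up.

function pick_language(array $supported, $accept_header, $default)
{
    $best  = $default;
    $bestQ = 0.0;
    foreach (explode(',', $accept_header) as $part) {
        $bits = explode(';', trim($part));
        $tag  = strtolower(trim($bits[0]));
        $q    = 1.0;
        if (isset($bits[1]) && preg_match('/q=([0-9.]+)/', $bits[1], $m)) {
            $q = (float) $m[1];
        }
        // Also try the primary subtag, so "de-AT" can match "de".
        foreach (array($tag, substr($tag, 0, 2)) as $candidate) {
            if (in_array($candidate, $supported) && $q > $bestQ) {
                $best  = $candidate;
                $bestQ = $q;
            }
        }
    }
    return $best;
}

echo pick_language(array('en', 'de', 'fr'),
                   'de-AT,de;q=0.9,en;q=0.8', 'en'); // prints "de"
```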

I have an __() method which takes the string to translate as an argument and
returns its translation for whatever language is currently selected. I then
just wrap any strings in __(), just like I would with Gettext.
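
A stripped-down sketch of such a wrapper, with a plain array standing in for
Translation2's lookup so the example is self-contained (all names here are
illustrative):

```php
<?php
// Stripped-down sketch of the __() wrapper. On the real site it
// delegates to Translation2; here a plain array stands in so the
// example is self-contained. All names are illustrative.

$GLOBALS['translations'] = array(
    'de' => array('Hello' => 'Guten Tag'),
);
$GLOBALS['current_lang'] = 'de';

function __($string)
{
    $lang = $GLOBALS['current_lang'];
    if (isset($GLOBALS['translations'][$lang][$string])) {
        return $GLOBALS['translations'][$lang][$string];
    }
    return $string; // no translation yet: fall back to the source string
}

echo __('Hello'), "\n";   // prints "Guten Tag"
echo __('Goodbye'), "\n"; // prints "Goodbye"
```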


> I'm thinking of storing my data in an XML format in MySQL with multiple
> translations and making my own search index for each language.  The
> problem with this is that I have to grab the entire XML doc for each
> field which may have 10-15 translations, parse and then display, wasting
> lots of processing and database time.
> 
> I'm not familiar with XML databases, and I'm told they are bad voodoo,
> but what is another solution if you have to store user entered records
> in 'n' languages?
> 
The approach that phpBB (and other PHP projects) use is to just store
everything in a big array in an include file. E.g.

langs/en.php:
$text['Hello'] = 'Hello';

langs/de.php:
$text['Hello'] = 'Guten Tag';

and so forth. This doesn't work well for content stored in a database, and
you end up loading a fairly large amount of stuff into memory even if it's
not used.
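
A minimal sketch of a loader for that pattern (paths and names are
illustrative, not from any particular project):

```php
<?php
// Sketch of a loader for the include-file approach. The whole array
// for a language is pulled into memory in one go, which is exactly
// the drawback mentioned above. Paths and names are illustrative.

function load_language($dir, $lang)
{
    $text = array();
    // basename() keeps "../"-style input out of the include path
    $file = $dir . '/' . basename($lang) . '.php';
    if (is_readable($file)) {
        include $file; // the file fills in $text['...'] entries
    }
    return $text;
}

// Usage, assuming langs/de.php contains: $text['Hello'] = 'Guten Tag';
$text = load_language('langs', 'de');
echo isset($text['Hello']) ? $text['Hello'] : 'Hello';
```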

Storing stuff in an XML file has (basically) the same disadvantages, except
that it's even harder on your memory usage, since you have to load up an
XML parser to read it.

As for storing localized data, that can be done with Translation2, but it
doesn't seem like it's necessarily the 'right' way to do things. You don't
say what kind of data you're storing, so I'm not sure how you should
approach that. What I do (for images) is turn on Apache's MultiViews and let
it sort out which version to send. The same approach can be used for any
type of data, though you will have to make the directory they're in
writable if you want to have them uploaded via a form. The alternative
would be to roll some of your own code to pull the correct version from a
SQL DB.
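
For the MultiViews setup, this is roughly the Apache configuration involved;
treat it as a sketch, since the directory, languages, and filenames are
examples rather than my actual setup:

```apache
# .htaccess for the images directory (a sketch; languages and
# filenames are examples, not my actual setup)
Options +MultiViews
AddLanguage en .en
AddLanguage de .de
```

With files named logo.png.en and logo.png.de in that directory (and no plain
logo.png), a request for logo.png is answered with the variant matching the
client's Accept-Language header.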

Hope this helps.

-- 
PHP Internationalization Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
