Thanks. I implemented MultiByte::convert_encoding() instead, but it
still does not work.
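
For reference, this is roughly the change I made in the importer (a
sketch only; I'm assuming convert_encoding() detects the source
encoding and converts to UTF-8 when called with just the string, as
described in the quoted mail below):

// posts: re-encode before building the array that gets saved
$post->content = MultiByte::convert_encoding( $post->content );
$post->title = MultiByte::convert_encoding( $post->title );
$post_array = $post->to_array();

// comments: same treatment
$comment->content = MultiByte::convert_encoding( $comment->content );
$comment->name = MultiByte::convert_encoding( $comment->name );
$carray = $comment->to_array();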

So I opened a new ticket for this problem:
https://trac.habariproject.org/habari-extras/ticket/211

Alexander

On 16 Feb., 13:35, rick c <[email protected]> wrote:
> A ticket in habari-extras regarding the problem would be appreciated.
>
> You can take a look at the WordPress importer that ships with
> Habari. It uses Habari's MultiByte class to re-encode the incoming
> data, in particular, MultiByte::convert_encoding(). This method tries
> to get the encoding of the incoming text, then convert it to utf-8.
> PHP's utf8_encode(), by contrast, only converts from ISO-8859-1.
>
> Rick
>
> On Feb 15, 7:06 am, Alexander <[email protected]> wrote:
>
> > Hi,
>
> > I'm trying to import a Drupal blog with the drupalimport plugin. Both
> > the old drupal and the new habari database are in utf8_general_ci.
> > It seems to work fine, apart from the special characters. So I added:
>
> > $post->content = utf8_encode( $post->content );
> > $post->title = utf8_encode( $post->title );
>
> > before
> >                                 $post_array = $post->to_array();
>
> > and
>
> > $comment->content = utf8_encode( $comment->content );
> > $comment->name = utf8_encode( $comment->name );
>
> > before
>
> > $carray = $comment->to_array();
>
> > as mentioned in
> > http://groups.google.com/group/habari-users/browse_thread/thread/a695...
>
> > This fixes my German Umlaut problems, but not the multibyte characters
> > (Japanese Kanji) I have in my database.
>
> > I've read that using mysql_query('set character set utf8;') might
> > solve the problem, but I have not yet been able to make this work.
>
> > Any tip on how to handle the import of Kanji would be greatly
> > appreciated,
> > Alexander
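
Regarding the "set character set utf8" suggestion quoted above, this is
roughly what I understand it to mean: force the connection character
set right after connecting to the Drupal database, before any rows are
read. A rough sketch, assuming the plugin reads the Drupal tables
through PHP's old mysql_* functions (the connection details here are
placeholders):

// hypothetical connection to the Drupal database
$drupal = mysql_connect( 'localhost', 'drupal_user', 'secret' );
mysql_select_db( 'drupal', $drupal );

// make the connection speak UTF-8 before fetching posts/comments;
// mysql_set_charset() (PHP 5.2.3+) also updates the client library's
// notion of the charset, so it is usually preferable to a raw query
if ( function_exists( 'mysql_set_charset' ) ) {
    mysql_set_charset( 'utf8', $drupal );
}
else {
    mysql_query( "SET CHARACTER SET utf8", $drupal );
}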
