Ok. Found the issue.

A change was required in the Java code.

BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream(strDirectory +
        "/" + sheet_name + ".out"), "UTF8"));
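For reference, here is the snippet above expanded into a minimal, self-contained sketch. The `strDirectory` and `sheetName` values are placeholders standing in for whatever the real program uses, and `StandardCharsets.UTF_8` (Java 7+) replaces the `"UTF8"` charset name string; the two name the same encoding.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class Utf8WriteDemo {
    public static void main(String[] args) throws IOException {
        // Placeholder values for strDirectory and sheet_name from the original code
        String strDirectory = ".";
        String sheetName = "demo_sheet";
        File outFile = new File(strDirectory, sheetName + ".out");

        // Wrapping the FileOutputStream in an OutputStreamWriter with an
        // explicit charset makes the on-disk encoding independent of the
        // platform default (the likely cause of the Windows/Unix difference).
        BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream(outFile), StandardCharsets.UTF_8));
        out.write("à");
        out.close();

        // Read back with the same charset to confirm the round trip.
        BufferedReader in = new BufferedReader(new InputStreamReader(
                new FileInputStream(outFile), StandardCharsets.UTF_8));
        String line = in.readLine();
        in.close();
        System.out.println(line);
    }
}
```

The key point is that `new FileWriter(...)` or a bare `PrintWriter` would have picked up the JVM's default encoding, which differs between operating systems; naming the charset explicitly removes that variable.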

Now it's working on both Windows and Unix.

Cheers,
Parag


On Sat, Aug 28, 2010 at 10:49 AM, Parag Kalra <[email protected]> wrote:

>
> Hmmm...
>
> The strange thing is that on the same OS (Solaris), if I manually enter a
> Unicode character in a text file, I am able to read it with editors
> like vi, or with cat.
>
> EG:
>
> bash-3.00$ echo -e "\340" > unicode.out ; cat unicode.out
> à
> bash-3.00$
> Hence I thought that it could be an issue with Apache POI, or maybe with the
> Java version on the operating system.
> Cheers,
> Parag
>
>   On Sat, Aug 28, 2010 at 10:38 AM, Nick Burch <[email protected]> wrote:
>
>>   On Sat, 28 Aug 2010, Parag Kalra wrote:
>>
>>> By international characters, I mean multibyte characters like - à
>>>
>>
>> POI handles unicode characters in all the file formats just fine - it's a
>> java program, and it does the right thing when converting what's in the
>> files into unicode in the JVM.
>>
>> You'll want to go and learn how to configure your OS properly for unicode.
>> Once you've done that, it'll all be fine, be it Linux, AIX, OSX, Windows or
>> whatever.
>>
>> Nick
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: [email protected]
>> For additional commands, e-mail: [email protected]
>>
>
>
