Rami Friedman <[EMAIL PROTECTED]> writes:
>I have a file that is encoded in utf-8. When I read it into a Java string
>and write it to the database, it gets written properly, but I have problems
>when I try to do the same thing in perl. I can read the file in (and if I
>send it out through a cgi, the characters display properly in a browser
Yes, you should have NLS_LANG=american_america.UTF8, where the language and
territory are whatever the database was originally set up with, since that
is the language in which messages from the database will be returned.
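
A rough sketch of how that looks from a Perl script, assuming the database
character set really is UTF8 (the connect string, credentials and DBI
attributes here are just placeholders):

    use strict;
    use warnings;
    use DBI;

    # NLS_LANG has to be in the environment before the Oracle client is
    # initialised, i.e. before DBI->connect, so set it first.  With the
    # character set part set to UTF8, the client treats the bytes you send
    # as UTF-8 and does no conversion on the way to a UTF8 database.
    $ENV{NLS_LANG} = 'AMERICAN_AMERICA.UTF8';

    my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'password',
                            { RaiseError => 1, AutoCommit => 1 } );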
Ken Shan wrote:
On 2000-12-14T11:55:57-0800, Rami Friedman wrote:
> Could I instead rely on the database driver to convert from the
> foreign charset to unicode?
I don't see why not. Have you looked at the Oracle documentation for
its NLS support? The Oracle driver should be happy to perform the
encoding conversion.
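
For example, one way to lean on the driver (just a sketch, assuming the
database itself is UTF8; the Oracle charset name ZHT16BIG5, connect string,
table and file names are all made up for illustration) is to declare the
feed's charset in NLS_LANG and let Oracle convert on the way in:

    use strict;
    use warnings;
    use DBI;

    # Declare that the bytes this client sends are Big5; the server then
    # converts them to the database character set (UTF8) on insert.
    $ENV{NLS_LANG} = 'AMERICAN_AMERICA.ZHT16BIG5';

    my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'password',
                            { RaiseError => 1 } );

    # Slurp the Big5 feed as raw bytes and hand them straight to the driver.
    open my $fh, '<', 'feed_big5.txt' or die "feed_big5.txt: $!";
    my $text = do { local $/; <$fh> };
    close $fh;

    my $sth = $dbh->prepare('INSERT INTO feeds (body) VALUES (?)');
    $sth->execute($text);
    $dbh->disconnect;

Note that NLS_LANG is read when the client initialises, so a loader like
this handles one feed charset per process.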
Rami Friedman <[EMAIL PROTECTED]> writes:
>I need to read files written in a variety of charsets (Big5, Arabic,
>Hebrew, etc) and write their contents to an oracle database. This
>problem is easy to solve in Java where each feed gets converted to a
>ucs-2 string, but, if possible, I need to write it in perl.
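
For what it's worth, from Perl 5.8 on the bundled Encode module can do the
same conversion step the Java version does. A minimal sketch, with the
charset and file name made up:

    use strict;
    use warnings;
    use Encode qw(decode encode);

    # Read the feed as raw bytes.
    open my $fh, '<', 'feed_big5.txt' or die "feed_big5.txt: $!";
    my $bytes = do { local $/; <$fh> };
    close $fh;

    # Decode the feed's charset into a Perl Unicode string (the equivalent
    # of converting to a ucs-2 String in Java) ...
    my $chars = decode('big5', $bytes);

    # ... then encode as UTF-8 before handing it to a UTF8 database.
    my $utf8 = encode('utf-8', $chars);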