jason hadoop wrote:

> How about new InputStreamReader( new StringReader( String ), "UTF-8" )
> replace UTF-8 with an appropriate charset.
>
>
> On Tue, Apr 28, 2009 at 7:47 PM, nguyenhuynh.mr 
> <nguyenhuynh...@gmail.com>wrote:
>
>   
>> Hi all!
>>
>>
>> I have a large String and I want to write it to a file in HDFS.
>>
>> (The string has more than 100,000 lines.)
>>
>>
>> Currently, I use the copyBytes method of org.apache.hadoop.io.IOUtils.
>> But copyBytes requires an InputStream for the content, so I have to
>> convert the String to an InputStream, something like:
>>
>>
>>
>>    InputStream in=new ByteArrayInputStream(sb.toString().getBytes());
>>
>>    Here "sb" is a StringBuffer.
>>
>>
>> It does not work with the line above. :(
>>
>> This is the error:
>>
>> Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>>    at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:232)
>>    at java.lang.StringCoding.encode(StringCoding.java:272)
>>    at java.lang.String.getBytes(String.java:947)
>>    at asnet.haris.mapred.jobs.Test.main(Test.java:32)
>>
>>
>>
>> Please suggest a good solution!
>>
>>
>> Thanks,
>>
>>
>> Best regards,
>>
>> Nguyen,
>>
>>
>>
>>
>>     
>
>
>   
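
A minimal sketch of what the suggestion above seems to boil down to: wrap the
String in a StringReader and copy it to the HDFS output stream in small
character chunks through an OutputStreamWriter with an explicit charset,
instead of materializing the whole byte[] with getBytes() (the call that fails
in the stack trace above). The output path, the buildLargeContent() helper and
the 8 KB buffer size are placeholders for illustration only, not part of the
original posts.

    import java.io.OutputStreamWriter;
    import java.io.StringReader;
    import java.io.Writer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class StreamStringToHdfs {

        public static void main(String[] args) throws Exception {
            StringBuffer sb = buildLargeContent();          // placeholder for the real content

            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            Path out = new Path("/user/nguyen/output.txt"); // example path

            FSDataOutputStream hdfsOut = fs.create(out);
            Writer writer = new OutputStreamWriter(hdfsOut, "UTF-8");
            StringReader reader = new StringReader(sb.toString());

            // Copy in 8 KB character chunks so only one small buffer is
            // encoded to bytes at any time, instead of the whole String.
            char[] buffer = new char[8192];
            int read;
            while ((read = reader.read(buffer)) != -1) {
                writer.write(buffer, 0, read);
            }
            writer.close(); // flushes and closes the underlying HDFS stream
            fs.close();
        }

        // Placeholder for however the large content is actually produced.
        private static StringBuffer buildLargeContent() {
            StringBuffer sb = new StringBuffer();
            for (int i = 0; i < 100000; i++) {
                sb.append("line ").append(i).append('\n');
            }
            return sb;
        }
    }

Note that sb.toString() still copies the StringBuffer's characters once; if
even that is too large for the heap, the content would have to be written to
the Writer as it is produced, instead of being collected in a StringBuffer
first.
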
Thanks for your answer!

Best,
Nguyen,
