If there isn't a clean translation to and from a string, then you can wrap an 
ObjectOutputStream around an FSDataOutputStream to HDFS and use the 
DistributedCache to localize it.  Your task could then read it back using Java's 
ObjectInputStream and FileInputStream.
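
The round trip described above can be sketched locally with plain java.io. This is a minimal stand-in, not the Hadoop API: in a real job the FileOutputStream side would be an FSDataOutputStream from FileSystem.create() on HDFS, and the file would be localized to each task via the DistributedCache. Class and method names here are illustrative.

```java
import java.io.*;

// Local stand-in for the HDFS + DistributedCache pattern described above.
// Names are illustrative; only the ObjectOutputStream/ObjectInputStream
// round trip is the point being demonstrated.
public class ObjectFileRoundTrip {

    // Job-setup side: serialize the object to a file.
    public static void write(File f, Serializable obj) {
        try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(f))) {
            oos.writeObject(obj);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    // Task side: read the object back with ObjectInputStream/FileInputStream.
    public static Object read(File f) {
        try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(f))) {
            return ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    // Convenience: full write-then-read round trip through a temp file.
    public static Object roundTrip(Serializable obj) {
        try {
            File f = File.createTempFile("payload", ".ser");
            f.deleteOnExit();
            write(f, obj);
            return read(f);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip("custom object state")); // prints "custom object state"
    }
}
```

Anything stored this way must implement java.io.Serializable, and the reading side must have the same class on its classpath.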

On Jan 26, 2011, at 11:00 PM, Joan <[email protected]> wrote:

> Hi Li,
> 
> Yes, I agree, but my object implements Serializable; how do I indicate that in 
> Hadoop's configuration?
> 
> Joan
> 
> 2011/1/27 li ping <[email protected]>
> Yes, I agree. The configuration parameter should be a serializable value, 
> because the parameter will be transferred to other nodes to run the job. If the 
> value cannot be serialized, I believe that would be a problem.
> 
> 
> On Wed, Jan 26, 2011 at 11:52 PM, David Rosenstrauch <[email protected]> 
> wrote:
> On 01/26/2011 05:43 AM, Joan wrote:
> Hi,
> 
> I'm trying to set an Object into Hadoop's Configuration, but I don't know how.
> 
> I want to do:
> 
> org.apache.hadoop.conf.Configuration conf = new
> org.apache.hadoop.conf.Configuration();
> 
> conf.set("txt",myObject);
> 
> But a method like conf.set(String, Object) doesn't exist.
> 
> Does anyone know how to set a custom object into the Hadoop configuration?
> 
> Thanks
> 
> Joan
> 
> 
> The object needs to be a String or something that is easily convertible to a 
> String (e.g., integer).  This is so it can be serialized and sent across the 
> cluster.
> 
> DR
> 
> 
> 
> -- 
> -----李平
> 
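
David's point above, that the value must be a String or something easily convertible to one, suggests a common workaround for small objects: serialize the object and Base64-encode the bytes, so the result can be stored with conf.set(key, string) and decoded again with conf.get(key) inside the task. A minimal sketch, assuming Java 8+ for java.util.Base64; the helper names are illustrative, not Hadoop API:

```java
import java.io.*;
import java.util.Base64;

// Illustrative helper (not Hadoop API): turn any Serializable object into a
// Base64 string so it fits Configuration's string-only values, e.g.
//   conf.set("txt", ConfSerde.toBase64(myObject));          // at submit time
//   Object o = ConfSerde.fromBase64(conf.get("txt"));       // inside the task
public class ConfSerde {

    public static String toBase64(Serializable obj) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return Base64.getEncoder().encodeToString(bos.toByteArray());
    }

    public static Object fromBase64(String s) {
        byte[] bytes = Base64.getDecoder().decode(s);
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String encoded = toBase64("my custom payload"); // the value passed to conf.set
        System.out.println(fromBase64(encoded));        // prints "my custom payload"
    }
}
```

Since every Configuration entry is shipped with the job, this only makes sense for small objects; anything large is better left to the HDFS-plus-DistributedCache approach from the first reply.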
