Before I copy the hdfs-site.xml, mapred-site.xml and yarn-site.xml of the
cluster configuration into my Eclipse classpath, I get the error
“core.xml not found.....”.
After I copy them, I get another error, “JobControl (not using hadoop 0.20 ?)”.
I don't know why. Have you got this connection working on your side?
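
For reference, here is a minimal sketch of how I understand the setup should
look once the hdfs-site.xml, mapred-site.xml and yarn-site.xml from the
cluster are on the Eclipse classpath (the class name, HDFS paths and alias
are just placeholders, not my real job):

    import java.util.Properties;

    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class PigFromEclipse {
        public static void main(String[] args) throws Exception {
            // The cluster *-site.xml files on the classpath should already
            // supply the NameNode and ResourceManager addresses, so no
            // connection properties are set here.
            Properties props = new Properties();
            PigServer pig = new PigServer(ExecType.MAPREDUCE, props);

            // Placeholder script, only to exercise the connection.
            pig.registerQuery("a = LOAD '/tmp/input' USING PigStorage(',');");
            pig.store("a", "/tmp/output");
        }
    }

Is that roughly what you have in mind?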

At 2014-12-20 01:47:08, "Rohini Palaniswamy" <rohini.adi...@gmail.com> wrote:
>You don't have to do that. You just need to copy the hdfs-site.xml,
>mapred-site.xml and yarn-site.xml of the cluster configuration and put them
>in your Eclipse classpath.
>
>On Thu, Dec 18, 2014 at 6:09 PM, 李运田 <cumt...@163.com> wrote:
>>
>> hi all,
>> I want to use Pig in Eclipse. My Hadoop (YARN) cluster and Eclipse are on
>> the same Linux cluster. My Pig configuration in Eclipse is:
>>
>>   Properties props = new Properties();
>>   props.setProperty("fs.defaultFS", "hdfs://10.210.90.*:8020");
>>   props.setProperty("hadoop.job.user", "hadoop");
>>   props.setProperty("mapreduce.framework.name", "yarn");
>>   props.setProperty("yarn.resourcemanager.hostname", "10.210.90.*");
>>   props.setProperty("yarn.resourcemanager.admin.address", "10.210.90.*:8141");
>>   props.setProperty("yarn.resourcemanager.address", "10.210.90.*:8050");
>>   props.setProperty("yarn.resourcemanager.resource-tracker.address", "10.210.90.*:8025");
>>   props.setProperty("yarn.resourcemanager.scheduler.address", "10.210.90.*:8030");
>>
>>
>> but it does not connect. I don't know how I can configure Pig in
>> Eclipse.
>> Can you help me, please?
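
If I instead keep the explicit properties quoted above, my understanding is
that the snippet still needs a PigServer to be built from those properties,
roughly like this (the addresses stay masked as in the original mail, and the
script line is only a placeholder):

    import java.util.Properties;

    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class PigWithExplicitProps {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.setProperty("fs.defaultFS", "hdfs://10.210.90.*:8020");
            props.setProperty("mapreduce.framework.name", "yarn");
            props.setProperty("yarn.resourcemanager.hostname", "10.210.90.*");
            props.setProperty("yarn.resourcemanager.address", "10.210.90.*:8050");
            props.setProperty("yarn.resourcemanager.scheduler.address", "10.210.90.*:8030");

            // Hand the properties to Pig instead of only building them.
            PigServer pig = new PigServer(ExecType.MAPREDUCE, props);
            pig.registerQuery("a = LOAD '/tmp/input' USING PigStorage(',');");
            pig.store("a", "/tmp/output");
        }
    }

My assumption is that passing the Properties to the PigServer constructor is
what actually makes them take effect.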
