Is 192.168.230.130 your local machine? If so, change /etc/hosts to map
"localhost" to "127.0.0.1".



On Tue, Sep 28, 2010 at 10:22 AM, Ngô Văn Vĩ <ngovi.se....@gmail.com> wrote:
> "192.168.230.130" is IP of my machine
> @JeffZhang: can you explain clearly?
> Thanks
>
> On Tue, Sep 28, 2010 at 8:39 AM, Jeff Zhang <zjf...@gmail.com> wrote:
>
>> It seems you connect to the right Hadoop when you start the pig grunt
>> shell, but to the wrong Hadoop when you run your pig script.
>> Check whether there are other configuration files that override your
>> default configuration. And what is the machine "192.168.230.130"?
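>> To find stray configuration files, something like this should show every
>> place fs.default.name and mapred.job.tracker are set (paths taken from
>> your mail; adjust them if your layout differs):
>>
>> grep -r "fs.default.name\|mapred.job.tracker" \
>>     /home/ngovi/hadoop-0.20.2/conf /home/ngovi/pig-0.7.0/conf
>>
>> If two of those files disagree, the grunt shell and the script run may
>> end up talking to different clusters.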
>>
>>
>> On Tue, Sep 28, 2010 at 9:23 AM, Ngô Văn Vĩ <ngovi.se....@gmail.com>
>> wrote:
>> > Can anyone help me?
>> > My configuration is:
>> > *- bin/pig*
>> > export JAVA_HOME=/home/ngovi/jdk1.6.0_21
>> > export PIG_INSTALL=/home/ngovi/pig-0.7.0
>> > export PATH=$PATH:$PIG_INSTALL/bin
>> > export PIG_HADOOP_VERSION=0.20.2
>> > export PIG_CLASSPATH=/home/ngovi/hadoop-0.20.2/conf/
>> > ....
>> > *- conf/pig.properties*
>> > fs.default.name=hdfs://localhost:9000/
>> > mapred.job.tracker=localhost:9001
>> > # log4jconf log4j configuration file
>> > I get an error when I run pig.
>> >
>> > *- in hadoop-0.20.2/conf*
>> > *core-site.xml*
>> > <configuration>
>> > <property>
>> > <name>fs.default.name</name>
>> > <value>hdfs://localhost:9000</value>
>> > <description>
>> > the name of the default file system
>> > </description>
>> > </property>
>> > </configuration>
>> > *hdfs-site.xml*
>> > <configuration>
>> > <property>
>> > <name>dfs.replication</name>
>> > <value>1</value>
>> > <description>Default block replication </description>
>> > </property>
>> > </configuration>
>> >
>> > *mapred-site.xml*
>> >
>> > <configuration>
>> > <property>
>> > <name>mapred.job.tracker</name>
>> > <value>localhost:9001</value>
>> > <description>
>> > the host and port that the mapreduce job tracker runs at
>> > </description>
>> > </property>
>> > </configuration>
>> >
>> > When I run pig I now get the following; is there still an error?
>> > *ng...@master:~/pig-0.7.0$ bin/pig -x mapreduce
>> > 10/09/27 18:16:29 INFO pig.Main: Logging error messages to: /home/ngovi/pig-0.7.0/pig_1285636589590.log
>> > 2010-09-27 18:16:30,029 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://localhost:9000/
>> > 2010-09-27 18:16:30,347 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to map-reduce job tracker at: localhost:9001
>> > grunt> *
>> >
>> >
>> > Thanks, all.
>> >
>> > On Mon, Sep 27, 2010 at 1:14 PM, Alan Gates <ga...@yahoo-inc.com> wrote:
>> >
>> >> Pig is failing to connect to your namenode.  Is the address Pig is
>> >> trying to use (hdfs://master:54310/) correct?  Can you connect using
>> >> that string from the same machine using bin/hadoop?
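>> >> For example (an illustrative check; -fs is a generic hadoop option
>> >> that points the client at a specific namenode):
>> >>
>> >> bin/hadoop fs -fs hdfs://master:54310/ -ls /
>> >>
>> >> If that command hangs or keeps retrying, the namenode is not reachable
>> >> at that address from your machine.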
>> >>
>> >> Alan.
>> >>
>> >>
>> >> On Sep 27, 2010, at 8:45 AM, Ngô Văn Vĩ wrote:
>> >>
>> >>> I run Pig in Hadoop (mapreduce) mode
>> >>> (Pig-0.7.0 and hadoop-0.20.2)
>> >>> and get this error:
>> >>> ng...@master:~/pig-0.7.0$ bin/pig
>> >>> 10/09/27 08:39:40 INFO pig.Main: Logging error messages to: /home/ngovi/pig-0.7.0/pig_1285601980268.log
>> >>> 2010-09-27 08:39:40,538 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://master:54310/
>> >>> 2010-09-27 08:39:41,760 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 0 time(s).
>> >>> 2010-09-27 08:39:42,762 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 1 time(s).
>> >>> 2010-09-27 08:39:43,763 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 2 time(s).
>> >>> 2010-09-27 08:39:44,765 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 3 time(s).
>> >>> 2010-09-27 08:39:45,766 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 4 time(s).
>> >>> 2010-09-27 08:39:46,767 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 5 time(s).
>> >>> 2010-09-27 08:39:47,768 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 6 time(s).
>> >>> 2010-09-27 08:39:48,769 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 7 time(s).
>> >>> 2010-09-27 08:39:49,770 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 8 time(s).
>> >>> 2010-09-27 08:39:50,771 [main] INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: master/192.168.230.130:54310. Already tried 9 time(s).
>> >>> 2010-09-27 08:39:50,780 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Failed to create DataStorage
>> >>>
>> >>> Can anyone help?
>> >>> Thanks
>> >>> --
>> >>> Ngô Văn Vĩ
>> >>> Software Engineering
>> >>> Phone: 01695893851
>> >>>
>> >>
>> >>
>> >
>> >
>> > --
>> > Ngô Văn Vĩ
>> > Software Engineering
>> > Phone: 01695893851
>> >
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
>
>
> --
> Ngô Văn Vĩ
> Software Engineering
> Phone: 01695893851
>



-- 
Best Regards

Jeff Zhang
