Sorry, the error keeps on, even when I modify the code to "offset,filename = line.strip().split("\t")".
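For context, the whole mapper now looks roughly like this (the same script quoted below, with only the strip() change applied):

    #!/bin/env python

    import sys

    for line in sys.stdin:
        # strip the trailing newline before splitting, so filename does not end with "\n"
        offset, filename = line.strip().split("\t")
        file = "hdfs://computeb-13:9100/user/hdfs/catalog3/" + filename
        print line
        print filename
        print file
        file_obj = open(file)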
At 2013-01-14 09:27:10, springring <springr...@126.com> wrote:
> hi,
>   I found the key point, it is not the hostname, that is right.
> just change "offset,filename = line.split("\t")" to
> "offset,filename = line.strip().split("\t")"
> now it passes
>
> At 2013-01-12 16:58:29, "Nitin Pawar" <nitinpawar...@gmail.com> wrote:
>> computedb-13 is not a valid host name
>>
>> maybe if you have a local hadoop then you can refer to it with
>> hdfs://localhost:9100/ or hdfs://127.0.0.1:9100
>>
>> if it's on another machine then just try the IP address of that machine
>>
>> On Sat, Jan 12, 2013 at 12:55 AM, springring <springr...@126.com> wrote:
>>
>>> hi,
>>>
>>> I modified the file as below, there is still an error
>>>
>>> #!/bin/env python
>>>
>>> import sys
>>>
>>> for line in sys.stdin:
>>>     offset,filename = line.split("\t")
>>>     file = "hdfs://computeb-13:9100/user/hdfs/catalog3/" + filename
>>>     print line
>>>     print filename
>>>     print file
>>>     file_obj = open(file)
>>>
>>> At 2013-01-12 16:34:37, "Nitin Pawar" <nitinpawar...@gmail.com> wrote:
>>> > is this the correct path for writing onto hdfs?
>>> >
>>> > "hdfs://user/hdfs/catalog3."
>>> >
>>> > I don't see the namenode info in the path. Can this cause any issue? Just
>>> > making a guess
>>> > something like hdfs://host:port/path
>>> >
>>> > On Sat, Jan 12, 2013 at 12:30 AM, springring <springr...@126.com> wrote:
>>> >
>>> >> hdfs://user/hdfs/catalog3/
>>> >
>>> > --
>>> > Nitin Pawar
>>
>> --
>> Nitin Pawar
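One more thing worth noting: Python's built-in open() only understands local filesystem paths, so it cannot read an hdfs:// URL directly, and that call can keep failing even after the filename is stripped. A minimal sketch of one possible workaround, shelling out to the hadoop fs -cat command (assuming the hadoop client is available on the task nodes; this is just an illustration, not a confirmed fix for this job):

    import subprocess

    hdfs_path = "hdfs://computeb-13:9100/user/hdfs/catalog3/" + filename
    # read the file contents from HDFS through the hadoop CLI instead of open()
    data = subprocess.check_output(["hadoop", "fs", "-cat", hdfs_path])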