Hi,

I am getting the following errors while trying to transfer data from
HDFS to HBase.
The table in HBase:
hbase(main):007:0> describe 'movies'
DESCRIPTION                                                           ENABLED
 {NAME => 'movies', FAMILIES => [{NAME => 'HBASE_ROW_KEY',            true
 BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION =>
 'NONE', VERSIONS => '3', TTL => '2147483647', BLOCKSIZE => '65536',
 IN_MEMORY => 'false', BLOCKCACHE => 'true'}, {NAME => 'name',
 BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION =>
 'NONE', VERSIONS => '3', TTL => '2147483647', BLOCKSIZE => '65536',
 IN_MEMORY => 'false', BLOCKCACHE => 'true'}, {NAME => 'year',
 BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION =>
 'NONE', VERSIONS => '3', TTL => '2147483647', BLOCKSIZE => '65536',
 IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}
1 row(s) in 0.1820 seconds
hbase(main):006:0> scan 'movies'
ROW COLUMN+CELL
1 column=name:, timestamp=1308044917482, value=new
1 column=year:, timestamp=1308044926957, value=2055
1 row(s) in 0.0710 seconds
The command line:

hadoop@hadoop:~/work/hadoop/hadoop-0.20.203.0$ bin/hadoop jar \
    ../../hbase/hbase-0.90.3/hbase-0.90.3.jar importtsv \
    -Dimporttsv.columns=HBASE_ROW_KEY,year,name \
    movies /user/hadoop/movies/movie.csv \
    -Dimporttsv.separator=',' 2>log
Output on stderr:
..some lines omitted..
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/hadoop/work/hadoop/hadoop-0.20.203.0/bin/../lib/native/Linux-i386-32
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:os.arch=i386
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.35-25-generic
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:user.name=hadoop
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/hadoop
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/hadoop/work/hadoop/hadoop-0.20.203.0
11/06/14 15:35:21 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
11/06/14 15:35:21 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181
11/06/14 15:35:21 INFO zookeeper.ClientCnxn: Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
11/06/14 15:35:21 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x1308d8861600014, negotiated timeout = 180000
11/06/14 15:35:22 INFO mapreduce.TableOutputFormat: Created table instance for movies
11/06/14 15:35:22 INFO input.FileInputFormat: Total input paths to process : 1
11/06/14 15:35:22 INFO mapred.JobClient: Running job: job_201106141233_0042
11/06/14 15:35:23 INFO mapred.JobClient: map 0% reduce 0%
11/06/14 15:38:16 INFO mapred.JobClient: Task Id : attempt_201106141233_0042_m_000000_0, Status : FAILED
java.lang.NullPointerException
        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.close(TableOutputFormat.java:107)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:650)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:765)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:369)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:259)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
        at org.apache.hadoop.mapred.Child.main(Child.java:253)
attempt_201106141233_0042_m_000000_0: Bad line at offset: 0:
attempt_201106141233_0042_m_000000_0: No delimiter
attempt_201106141233_0042_m_000000_0: Bad line at offset: 34:
attempt_201106141233_0042_m_000000_0: No delimiter
attempt_201106141233_0042_m_000000_0: Bad line at offset: 51:
.......................... (the "Bad line at offset" / "No delimiter" pair repeats, 33123 lines in total)
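If I am reading the "No delimiter" messages right, importtsv seems to be splitting each input line on its default tab separator rather than the comma I tried to pass, so every line of the CSV looks delimiter-free to it. Here is a rough sketch of that delimiter check to illustrate what I mean (plain Python, not HBase's actual TsvParser code):

```python
# Illustrative sketch only: importtsv splits each input line on a
# single-character separator, which defaults to a tab. If the separator
# never appears in a line, the line is rejected, which would match the
# "Bad line ... / No delimiter" messages above.

DEFAULT_SEPARATOR = "\t"  # importtsv's default separator

def parse_line(line, separator=DEFAULT_SEPARATOR):
    """Mimic the delimiter check: reject a line lacking the separator."""
    if separator not in line:
        return None, "No delimiter"
    return line.split(separator), None

# A comma-separated record against the default tab separator is rejected:
fields, err = parse_line("1,new,2055")
assert err == "No delimiter"

# The same record parses fine once the comma is actually used:
fields, err = parse_line("1,new,2055", ",")
assert fields == ["1", "new", "2055"]
```

So it looks to me as if my -Dimporttsv.separator=',' option is simply not taking effect (it comes after the positional arguments in my command). Is that the problem, or is something else going on?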