Forgot to copy the user list.

On Sat, Oct 4, 2014 at 3:12 PM, Nick Pentreath <nick.pentre...@gmail.com>
wrote:

> what version did you put in the pom.xml?
>
> it does seem to be in Maven Central:
> http://search.maven.org/#artifactdetails%7Corg.apache.hbase%7Chbase%7C0.98.6-hadoop2%7Cpom
>
> <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase</artifactId>
>     <version>0.98.6-hadoop2</version>
> </dependency>
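>
> (Note that aggregate artifact is packaging "pom", since HBase was
> modularized in 0.96; if your build can't resolve it as a jar, you may
> need the individual modules instead. Something like the following,
> though I haven't tried it:)
>
> <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-client</artifactId>
>     <version>0.98.6-hadoop2</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-common</artifactId>
>     <version>0.98.6-hadoop2</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-server</artifactId>
>     <version>0.98.6-hadoop2</version>
> </dependency>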
>
> Note you shouldn't need to rebuild Spark itself; I think just the examples
> project, via sbt examples/assembly.
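>
> For example (untested, and the exact assembly jar name depends on the
> Hadoop version you built against):
>
>     sbt examples/assembly
>     ./bin/spark-submit --master local \
>       --driver-class-path examples/target/scala-2.10/spark-examples-*.jar \
>       examples/src/main/python/hbase_inputformat.py localhost myhbasetable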
>
> On Fri, Oct 3, 2014 at 10:55 AM, serkan.dogan <foreignerdr...@yahoo.com>
> wrote:
>
>> Hi,
>> I installed hbase-0.98.6-hadoop2. It's working fine; no problems there.
>>
>>
>> When I try to run the Spark HBase Python example (the wordcount examples
>> work, so this is not a Python issue):
>>
>>  ./bin/spark-submit --master local \
>>    --driver-class-path ./examples/target/spark-examples_2.10-1.1.0.jar \
>>    ./examples/src/main/python/hbase_inputformat.py localhost myhbasetable
>>
>> the process exits with a ClassNotFoundException.
>>
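>> For reference, the failing call in hbase_inputformat.py (around line 70;
>> paraphrased from the example source, so details may differ slightly) is:
>>
>>     conf = {"hbase.zookeeper.quorum": host,
>>             "hbase.mapreduce.inputtable": table}
>>     hbase_rdd = sc.newAPIHadoopRDD(
>>         "org.apache.hadoop.hbase.mapreduce.TableInputFormat",
>>         "org.apache.hadoop.hbase.io.ImmutableBytesWritable",  # the class it can't find
>>         "org.apache.hadoop.hbase.client.Result",
>>         keyConverter="org.apache.spark.examples.pythonconverters.ImmutableBytesWritableToStringConverter",
>>         valueConverter="org.apache.spark.examples.pythonconverters.HBaseResultToStringConverter",
>>         conf=conf)
>>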
>> I searched lots of blogs and sites; they all say "Spark 1.1 is built
>> against HBase 0.94.6; rebuild it with your own HBase version".
>>
>> First, I tried changing the HBase version number in pom.xml, but nothing
>> was found on Maven Central.
>>
>> Second, I compiled HBase from source, copied the HBase jars from
>> hbase/lib into the spark/lib_managed folder, and edited
>> spark-defaults.conf.
>>
>> My spark-defaults.conf:
>>
>> spark.executor.extraClassPath /home/downloads/spark/spark-1.1.0/lib_managed/jars/hbase-server-0.98.6-hadoop2.jar:/home/downloads/spark/spark-1.1.0/lib_managed/jars/hbase-protocol-0.98.6-hadoop2.jar:/home/downloads/spark/spark-1.1.0/lib_managed/jars/hbase-hadoop2-compat-0.98.6-hadoop2.jar:/home/downloads/spark/spark-1.1.0/lib_managed/jars/hbase-client-0.98.6-hadoop2.jar:/home/downloads/spark/spark-1.1.0/lib_managed/jars/hbase-common-0.98.6-hadoop2.jar:/home/downloads/spark/spark-1.1.0/lib_managed/jars/htrace-core-2.04.jar
>>
>>
>> My question is: how can I get HBase 0.98.6-hadoop2 working with Spark 1.1.0?
>>
>> Here is the exception message:
>>
>>
>> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
>> 14/10/03 11:27:15 WARN Utils: Your hostname, xxx.yyy.com resolves to a loopback address: 127.0.0.1; using 1.1.1.1 instead (on interface eth0)
>> 14/10/03 11:27:15 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
>> 14/10/03 11:27:15 INFO SecurityManager: Changing view acls to: root,
>> 14/10/03 11:27:15 INFO SecurityManager: Changing modify acls to: root,
>> 14/10/03 11:27:15 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root, ); users with modify permissions: Set(root, )
>> 14/10/03 11:27:16 INFO Slf4jLogger: Slf4jLogger started
>> 14/10/03 11:27:16 INFO Remoting: Starting remoting
>> 14/10/03 11:27:16 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkdri...@1-1-1-1-1.rev.mydomain.io:49256]
>> 14/10/03 11:27:16 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkdri...@1-1-1-1-1.rev.mydomain.io:49256]
>> 14/10/03 11:27:16 INFO Utils: Successfully started service 'sparkDriver' on port 49256.
>> 14/10/03 11:27:16 INFO SparkEnv: Registering MapOutputTracker
>> 14/10/03 11:27:16 INFO SparkEnv: Registering BlockManagerMaster
>> 14/10/03 11:27:16 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20141003112716-298d
>> 14/10/03 11:27:16 INFO Utils: Successfully started service 'Connection manager for block manager' on port 35106.
>> 14/10/03 11:27:16 INFO ConnectionManager: Bound socket to port 35106 with id = ConnectionManagerId(1-1-1-1-1.rev.mydomain.io,35106)
>> 14/10/03 11:27:16 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
>> 14/10/03 11:27:16 INFO BlockManagerMaster: Trying to register BlockManager
>> 14/10/03 11:27:16 INFO BlockManagerMasterActor: Registering block manager 1-1-1-1-1.rev.mydomain.io:35106 with 267.3 MB RAM
>> 14/10/03 11:27:16 INFO BlockManagerMaster: Registered BlockManager
>> 14/10/03 11:27:16 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f60b0533-998f-4af2-a208-d04c571eab82
>> 14/10/03 11:27:16 INFO HttpServer: Starting HTTP Server
>> 14/10/03 11:27:16 INFO Utils: Successfully started service 'HTTP file server' on port 49611.
>> 14/10/03 11:27:16 INFO Utils: Successfully started service 'SparkUI' on port 4040.
>> 14/10/03 11:27:16 INFO SparkUI: Started SparkUI at http://1-1-1-1-1.rev.mydomain.io:4040
>> 14/10/03 11:27:16 INFO Utils: Copying /home/downloads/spark/spark-1.1.0/./examples/src/main/python/hbase_inputformat.py to /tmp/spark-7232227a-0547-454e-9f68-805fa7b0c2f0/hbase_inputformat.py
>> 14/10/03 11:27:16 INFO SparkContext: Added file file:/home/downloads/spark/spark-1.1.0/./examples/src/main/python/hbase_inputformat.py at http://1.1.1.1:49611/files/hbase_inputformat.py with timestamp 1412324836837
>> 14/10/03 11:27:16 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkdri...@1-1-1-1-1.rev.mydomain.io:49256/user/HeartbeatReceiver
>> Traceback (most recent call last):
>>   File "/home/downloads/spark/spark-1.1.0/./examples/src/main/python/hbase_inputformat.py", line 70, in <module>
>>     conf=conf)
>>   File "/home/downloads/spark/spark-1.1.0/python/pyspark/context.py", line 471, in newAPIHadoopRDD
>>     jconf, batchSize)
>>   File "/usr/lib/python2.6/site-packages/py4j-0.8.2.1-py2.6.egg/py4j/java_gateway.py", line 538, in __call__
>>     self.target_id, self.name)
>>   File "/usr/lib/python2.6/site-packages/py4j-0.8.2.1-py2.6.egg/py4j/protocol.py", line 300, in get_return_value
>>     format(target_id, '.', name), value)
>> py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.newAPIHadoopRDD.
>> : java.lang.ClassNotFoundException: org.apache.hadoop.hbase.io.ImmutableBytesWritable
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:270)
>>         at org.apache.spark.util.Utils$.classForName(Utils.scala:150)
>>         at org.apache.spark.api.python.PythonRDD$.newAPIHadoopRDDFromClassNames(PythonRDD.scala:451)
>>         at org.apache.spark.api.python.PythonRDD$.newAPIHadoopRDD(PythonRDD.scala:436)
>>         at org.apache.spark.api.python.PythonRDD.newAPIHadoopRDD(PythonRDD.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:622)
>>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
>>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>>         at py4j.Gateway.invoke(Gateway.java:259)
>>         at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
>>         at py4j.commands.CallCommand.execute(CallCommand.java:79)
>>         at py4j.GatewayConnection.run(GatewayConnection.java:207)
>>         at java.lang.Thread.run(Thread.java:701)
>>
>>
>>
>>
>
