I have written some very simple Scala code, listed below. It creates a Spark
job and writes something into Couchbase, but when I run the job on Spark I
get an exception, and I am wondering how this happens.
One thing worth pointing out: if I don't use Spark, the code works.



Some other tips: 
   1. The class org.apache.http.protocol.RequestUserAgent is in
httpcore-4.3; this package is a dependency of the Couchbase SDK, and the
class has a constructor that takes a String parameter. 
   2. Two dependencies: 
    "com.couchbase.client" % "couchbase-client" % "1.4.4", 
    "org.apache.spark" %% "spark-core" % "1.0.1" % "provided" 
   3. I am building with sbt-assembly.
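
For what it's worth, a NoSuchMethodError at runtime usually means a different
httpcore version ended up on the classpath than the one the SDK was compiled
against (Spark/Hadoop pull in an older httpcore). Below is a sketch of how one
might pin the newer version in build.sbt; the explicit httpcore coordinate is
my assumption, not something from the original build:

```scala
// build.sbt sketch (assumption: pinning httpcore 4.3 so the assembly and the
// runtime classpath agree; jars shipped with Spark may still shadow it).
libraryDependencies ++= Seq(
  "com.couchbase.client"       %  "couchbase-client" % "1.4.4",
  "org.apache.spark"           %% "spark-core"       % "1.0.1" % "provided",
  // Hypothetical explicit pin to the httpcore the Couchbase SDK expects:
  "org.apache.httpcomponents"  %  "httpcore"         % "4.3"
)
```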

CODE LIST:

import java.net.URI
import com.couchbase.client.CouchbaseClient
import scala.collection.JavaConversions._
import scala.collection.mutable.ArrayBuffer
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object Run {
    def main(args: Array[String]) {
        val conf = new SparkConf(true)
        val sc = new SparkContext("local", "test", conf)
        // Connect to the local Couchbase node and write/read a single key.
        val nodes = ArrayBuffer(URI.create("http://127.0.0.1:8091/pools"))
        val client = new CouchbaseClient(nodes, "Result", "")
        client.set("hello", "hello, world")
        val result = client.get("hello")
        println(result)
        client.shutdown()
        System.exit(0)
    }
}
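
To confirm which jar the conflicting class is actually loaded from at runtime,
a small diagnostic like the following could be dropped into the job (WhichJar
is a hypothetical helper, not part of the original code); printing the code
source of RequestUserAgent should reveal whether the assembly's httpcore-4.3
or an older copy on Spark's classpath wins:

```scala
// Hypothetical diagnostic: report which jar (code source) a class was loaded from.
object WhichJar {
  def locate(className: String): String = {
    val cls = Class.forName(className)
    // Classes from the bootstrap classpath may have no CodeSource, hence the Option.
    Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
      .getOrElse("(bootstrap classpath)")
  }

  def main(args: Array[String]): Unit =
    // On the job's real classpath this prints the jar providing httpcore:
    println(locate("org.apache.http.protocol.RequestUserAgent"))
}
```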

Exception:

14/08/09 10:19:32 WARN Utils: Your hostname, precise64 resolves to a loopback address: 127.0.1.1; using 192.168.33.10 instead (on interface eth1)
14/08/09 10:19:32 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
14/08/09 10:19:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-08-09 10:19:40.569 INFO net.spy.memcached.auth.AuthThread: Authenticated to /127.0.0.1:11210
2014-08-09 10:19:40.677 INFO com.couchbase.client.vbucket.provider.BucketConfigurationProvider: Could bootstrap through carrier publication.
2014-08-09 10:19:40.693 INFO com.couchbase.client.CouchbaseConnection: Added {QA sa=localhost/127.0.0.1:11210, #Rops=0, #Wops=0, #iq=0, topRop=null, topWop=null, toWrite=0, interested=0} to connect queue
2014-08-09 10:19:40.702 INFO com.couchbase.client.CouchbaseClient: CouchbaseConnectionFactory{bucket='Result', nodes=[http://127.0.0.1:8091/pools], order=RANDOM, opTimeout=2500, opQueue=16384, opQueueBlockTime=10000, obsPollInt=10, obsPollMax=500, obsTimeout=5000, viewConns=10, viewTimeout=75000, viewWorkers=1, configCheck=10, reconnectInt=1100, failureMode=Redistribute, hashAlgo=NATIVE_HASH, authWaitTime=2500}
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.http.protocol.RequestUserAgent.<init>(Ljava/lang/String;)V
        at com.couchbase.client.ViewConnection.<init>(ViewConnection.java:156)
        at com.couchbase.client.CouchbaseConnectionFactory.createViewConnection(CouchbaseConnectionFactory.java:254)
        at com.couchbase.client.CouchbaseClient.<init>(CouchbaseClient.java:266)
        at com.couchbase.client.CouchbaseClient.<init>(CouchbaseClient.java:194)
        at com.eyespage.dc.etl.c2s.Run$.main(main.scala:16)
        at com.eyespage.dc.etl.c2s.Run.main(main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2014-08-09 10:19:40.836 INFO net.spy.memcached.auth.AuthThread: Authenticated to localhost/127.0.0.1:11210



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Exception-when-call-couchbase-sdk-in-Spark-Job-tp11872.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
