[ https://issues.apache.org/jira/browse/HBASE-13992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14639290#comment-14639290 ]
Sean Busbey commented on HBASE-13992:
-------------------------------------
I'm trying to do a final build to make sure I've got the patch applied
correctly, and the new integration tests are failing.
{code}
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ hbase-spark ---
WARNING: -c has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -c to -P.
Discovery starting.
Discovery completed in 284 milliseconds.
Run starting. Expected test count is: 11
HBaseDStreamFunctionsSuite:
HBaseContextSuite:
HBaseRDDFunctionsSuite:
2015-07-23 10:50:56.702 java[97585:6403] Unable to load realm info from SCDynamicStore
*** RUN ABORTED ***
java.util.concurrent.ExecutionException: java.io.IOException: Shutting down
  at java.util.concurrent.FutureTask.report(FutureTask.java:122)
  at java.util.concurrent.FutureTask.get(FutureTask.java:188)
  at org.scalatest.tools.ConcurrentDistributor.waitUntilDone(ConcurrentDistributor.scala:52)
  at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2549)
  at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
  at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
  at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
  at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
  at org.scalatest.tools.Runner$.main(Runner.scala:860)
  at org.scalatest.tools.Runner.main(Runner.scala)
  ...
Cause: java.io.IOException: Shutting down
  at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:232)
  at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
  at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1040)
  at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1006)
  at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:44)
  at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
  at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:30)
  at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
  at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.run(HBaseDStreamFunctionsSuite.scala:30)
  at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
  ...
Cause: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterAddress already in use
  at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
  at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:218)
  at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:154)
  at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:214)
  at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
  at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1040)
  at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1006)
  at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:44)
  at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
  at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:30)
  ...
Cause: java.net.BindException: Port in use: 0.0.0.0:16010
  at org.apache.hadoop.hbase.http.HttpServer.openListeners(HttpServer.java:1013)
  at org.apache.hadoop.hbase.http.HttpServer.start(HttpServer.java:949)
  at org.apache.hadoop.hbase.http.InfoServer.start(InfoServer.java:91)
  at org.apache.hadoop.hbase.regionserver.HRegionServer.putUpWebUI(HRegionServer.java:1789)
  at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:604)
  at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:363)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  ...
Cause: java.net.BindException: Address already in use
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:444)
  at sun.nio.ch.Net.bind(Net.java:436)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
  at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
  at org.apache.hadoop.hbase.http.HttpServer.openListeners(HttpServer.java:1008)
  at org.apache.hadoop.hbase.http.HttpServer.start(HttpServer.java:949)
  at org.apache.hadoop.hbase.http.InfoServer.start(InfoServer.java:91)
  at org.apache.hadoop.hbase.regionserver.HRegionServer.putUpWebUI(HRegionServer.java:1789)
  ...
{code}
I'm running {{mvn -DskipTests install}} at the top level and then {{mvn -Dtest=NoUnitTests clean verify}} in the spark module.
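For reference, the trace bottoms out in {{java.net.BindException: Port in use: 0.0.0.0:16010}}, the default HMaster info-server port, so something else on this machine (possibly a leftover mini cluster from an earlier run) is probably still holding that port. If we want the suites to be resilient to that, a minimal sketch along these lines would disable the web UI ports before the mini cluster starts. This assumes the suites keep using {{HBaseTestingUtility}} the way {{HBaseDStreamFunctionsSuite}} does; {{hbase.master.info.port}} and {{hbase.regionserver.info.port}} are standard HBase keys, but wiring them into the suite setup like this is only an illustration, not something in the patch:
{code}
// Sketch only: turn off the info servers so the mini cluster never tries to bind 16010/16030.
// The config keys are standard HBase settings; adding them to the suite setup is an assumption.
import org.apache.hadoop.hbase.HBaseTestingUtility

val htu = new HBaseTestingUtility()
htu.getConfiguration.setInt("hbase.master.info.port", -1)       // -1 disables the master web UI
htu.getConfiguration.setInt("hbase.regionserver.info.port", -1) // same for the region server web UI
htu.startMiniCluster()
// ... run the suite against htu.getConfiguration ...
htu.shutdownMiniCluster()
{code}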
> Integrate SparkOnHBase into HBase
> ---------------------------------
>
> Key: HBASE-13992
> URL: https://issues.apache.org/jira/browse/HBASE-13992
> Project: HBase
> Issue Type: New Feature
> Components: spark
> Reporter: Ted Malaska
> Assignee: Ted Malaska
> Fix For: 2.0.0
>
> Attachments: HBASE-13992.5.patch, HBASE-13992.6.patch,
> HBASE-13992.7.patch, HBASE-13992.8.patch, HBASE-13992.9.patch,
> HBASE-13992.patch, HBASE-13992.patch.3, HBASE-13992.patch.4,
> HBASE-13992.patch.5
>
>
> This Jira is to ask if SparkOnHBase can find a home inside HBase core.
> Here is the GitHub repo:
> https://github.com/cloudera-labs/SparkOnHBase
> I am the core author of this project, and the license is Apache 2.0.
> A blog post explaining this project is here:
> http://blog.cloudera.com/blog/2014/12/new-in-cloudera-labs-sparkonhbase/
> A Spark Streaming example is here:
> http://blog.cloudera.com/blog/2014/11/how-to-do-near-real-time-sessionization-with-spark-streaming-and-apache-hadoop/
> A real customer using this in production is blogged about here:
> http://blog.cloudera.com/blog/2015/03/how-edmunds-com-used-spark-streaming-to-build-a-near-real-time-dashboard/
> Please debate and let me know what I can do to make this happen.