RE: Has anyone managed to connect to Oracle via JDBC from Spark CDH 5.5.2

2016-12-22 Thread Alexander Kapustin
Hello,

For Spark 1.5 (and for 1.6) we use the Oracle JDBC driver via spark-submit.sh --jars
/path/to/ojdbc6.jar
We also pass additional Oracle driver properties via --driver-java-options
Sent from Mail for Windows 10

From: Mich Talebzadeh
Sent: 22 December 2016 0:40
To: user@spark
Subject: Has anyone managed to connect to Oracle via JDBC from Spark CDH 5.5.2

This works with Spark 2 with Oracle jar file added to


$SPARK_HOME/conf/spark-defaults.conf

spark.driver.extraClassPath      /home/hduser/jars/ojdbc6.jar
spark.executor.extraClassPath    /home/hduser/jars/ojdbc6.jar



and you get

scala> val s = HiveContext.read.format("jdbc").options(
     |   Map("url" -> _ORACLEserver,
     |     "dbtable" -> "(SELECT to_char(ID) AS ID, to_char(CLUSTERED) AS CLUSTERED, to_char(SCATTERED) AS SCATTERED, to_char(RANDOMISED) AS RANDOMISED, RANDOM_STRING, SMALL_VC, PADDING FROM scratchpad.dummy)",
     |     "partitionColumn" -> "ID",
     |     "lowerBound" -> "1",
     |     "upperBound" -> "1",
     |     "numPartitions" -> "10",
     |     "user" -> _username,
     |     "password" -> _password)).load
s: org.apache.spark.sql.DataFrame = [ID: string, CLUSTERED: string ... 5 more fields]

that works.
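As an aside, in the transcript above lowerBound and upperBound are both "1" while numPartitions is 10. Those bounds do not filter rows; they only control how the partition column's range is split into per-partition WHERE clauses, so with equal bounds the stride is zero and one partition ends up doing essentially all the work. A simplified sketch of the splitting logic (modeled loosely on Spark's `JDBCRelation.columnPartition`, not the exact implementation; the column name and bounds are illustrative):

```scala
// Split (lower, upper) on a numeric partition column into numPartitions
// WHERE-clause ranges, the way Spark's JDBC source does (simplified).
def partitionClauses(column: String, lower: Long, upper: Long,
                     numPartitions: Int): Seq[String] = {
  val stride = (upper - lower) / numPartitions  // 0 when upper == lower
  (0 until numPartitions).map { i =>
    val lo = lower + i * stride
    val hi = lo + stride
    if (i == 0)                      s"$column < $hi or $column is null"
    else if (i == numPartitions - 1) s"$column >= $lo"  // last range is open-ended
    else                             s"$column >= $lo AND $column < $hi"
  }
}
```

With lower = upper = 1 every intermediate range is empty and the open-ended last clause reads the whole table in a single task, so realistic bounds (e.g. the actual min and max of ID) are needed to get parallelism.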
However, with CDH 5.5.2 (Spark 1.5) it fails with the error:

java.sql.SQLException: No suitable driver
  at java.sql.DriverManager.getDriver(DriverManager.java:315)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:54)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:54)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnectionFactory(JdbcUtils.scala:53)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:123)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:117)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:53)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:315)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)

Any ideas?

Thanks





Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any loss, 
damage or destruction of data or any other property which may arise from 
relying on this email's technical content is explicitly disclaimed. The author 
will in no case be liable for any monetary damages arising from such loss, 
damage or destruction.




RE: spark job automatically killed without rhyme or reason

2016-06-17 Thread Alexander Kapustin
tStream.read(DFSInputStream.java:903)
  at java.io.DataInputStream.readFully(DataInputStream.java:195)
  at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.readStripeFooter(RecordReaderImpl.java:2265)
  at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.readStripe(RecordReaderImpl.java:2635)
  ...

Thanks in advance!




    On Friday, June 17, 2016 3:52 PM, Alexander Kapustin <kp...@hotmail.com> 
wrote:


Hi,

Did you submit the spark job via YARN? In some cases (probably memory configuration), YARN can kill the containers where spark tasks are executed. In this situation, please check the yarn userlogs for more information.

--
WBR, Alexander

From: Zhiliang Zhu
Sent: 17 June 2016 9:36
To: Zhiliang Zhu; User
Subject: Re: spark job automatically killed without rhyme or reason

Has anyone ever met a similar problem? It is quite strange ...

On Friday, June 17, 2016 2:13 PM, Zhiliang Zhu <zchl.j...@yahoo.com.INVALID> 
wrote:


Hi All,
I have a big job which takes more than an hour to run in full; however, it exits midway (almost 80% of the job actually finished, but not all) without any apparent error or exception in the log.
I have submitted the same job many times, always with the same result. The last line of the run log is just the single word "killed", or sometimes there is no error output at all; everything seems okay, but the job should not have stopped.
How can I diagnose this problem? Has anyone else met a similar issue?
Thanks in advance!










RE: spark job automatically killed without rhyme or reason

2016-06-17 Thread Alexander Kapustin
Hi,

Did you submit the spark job via YARN? In some cases (probably memory configuration), YARN can kill the containers where spark tasks are executed. In this situation, please check the yarn userlogs for more information.

--
WBR, Alexander
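If the yarn userlogs show the container being killed for exceeding its memory limit, the usual remedy on Spark 1.x is to raise the executor memory overhead that YARN reserves on top of the JVM heap. A possible configuration sketch (the values below are illustrative assumptions, not recommendations for this cluster):

```
# spark-defaults.conf (values illustrative; tune for your cluster)
spark.executor.memory                4g
spark.yarn.executor.memoryOverhead   768
```

The aggregated logs for a finished application can be pulled with `yarn logs -applicationId <application_id>` to confirm whether YARN reported a memory-limit kill.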

From: Zhiliang Zhu
Sent: 17 June 2016 9:36
To: Zhiliang Zhu; User
Subject: Re: spark job automatically killed without rhyme or reason

Has anyone ever met a similar problem? It is quite strange ...

On Friday, June 17, 2016 2:13 PM, Zhiliang Zhu 
 wrote:


Hi All,
I have a big job which takes more than an hour to run in full; however, it exits midway (almost 80% of the job actually finished, but not all) without any apparent error or exception in the log.
I have submitted the same job many times, always with the same result. The last line of the run log is just the single word "killed", or sometimes there is no error output at all; everything seems okay, but the job should not have stopped.
How can I diagnose this problem? Has anyone else met a similar issue?
Thanks in advance!