hi rxin,
Will Spark SQL support authorization, and not only for DDL?
In my use case, a hive table was granted read access to userA, and other
users don't have permission to read it, but userB can still read this hive
table using spark sql.
Ricky Ou
hi , all
when migrating hive sql to spark sql, I encountered an incompatibility
problem. Please give me some suggestions.
The hive table description and data format are as follows:
use spark;
drop table spark.test_or1;
CREATE TABLE `spark.test_or1`(
  `statis_date` string,
  `lbl_nm` string)
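For context, a minimal sketch (my assumption: Spark 1.x with Hive support,
consistent with the versions elsewhere in this digest) of querying the table
above from Scala:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Minimal sketch: read the hive table defined above through a HiveContext.
val sc = new SparkContext(new SparkConf().setAppName("HiveReadTest"))
val hiveContext = new HiveContext(sc)
hiveContext.sql("SELECT statis_date, lbl_nm FROM spark.test_or1").show()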
hi all,
--driver-java-options does not support multiple JVM configurations?
The submitted options are as follows:
Cores=16
sparkdriverextraJavaOptions="-XX:NewSize=2096m -XX:MaxPermSize=512m
-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseParNewGC
-XX:+UseConcMarkSweepGC -XX:GCTimeLimit=5 -XX:GCHeapFreeLimit=95"
From: Marcelo Vanzin
Date: 2016-01-21 12:09
To: our...@cnsuning.com
CC: user
Subject: Re: --driver-java-options not support multiple JVM configuration ?
On Wed, Jan 20, 2016 at 7:38 PM, our...@cnsuning.com
<our...@cnsuning.com> wrote:
> --driver-java-options $sparkdri
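A sketch of the commonly suggested fix (an inference on my part, since the
rest of the reply is truncated here): quote the variable expansion so the
shell passes all JVM flags to spark-submit as one --driver-java-options
argument. com.example.Main and app.jar are placeholders, not from the thread.

sparkdriverextraJavaOptions="-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseConcMarkSweepGC"
spark-submit \
  --driver-java-options "$sparkdriverextraJavaOptions" \
  --class com.example.Main \
  app.jar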
So sorry, it should be Seq, not sql. Thanks for your help.
Ricky Ou(欧 锐)
From: Dean Wampler
Date: 2015-12-23 00:46
To: our...@cnsuning.com
CC: user; t...@databricks.com
Subject: Re: spark streaming updateStateByKey state does not support types
without a ClassTag, such as list
org.apache.spark.rdd.RDD[(String, Seq[Int])])
val stateDstream = wordDstream.updateStateByKey[Seq[Int]](newUpdateFunc,
Ricky Ou(欧 锐)
Department: Suning Commerce, IT HQ Technology Support R&D Center,
Big Data Center, Data Platform Development Department
tel :18551600418
email : our...@cnsuning.com
From: Dean Wampler
Date: 2015-12-23 00:46
To: our...@cnsuning.com
// This will give a DStream made of state (which is the cumulative count of
// the words)
// wordDstream.updateStateByKey(newUpdateFunc, new HashPartitioner(
//   ssc.sparkContext.defaultParallelism), true, initialRDD)
val stateDstream = wordDstream.updateStateByKey[Seq[Int]](newUpdateFunc,
  new HashPartitioner(ssc.sparkContext.defaultParallelism), true, initialRDD)
stateDstream.print()
ssc.start()
ssc.awaitTermination()
spark streaming updateStateByKey does not support an Array state type without
a ClassTag? How to solve the problem?
def updateStateByKey[S: ClassTag](
    updateFunc: (Seq[V], Option[S]) => Option[S]
  ): DStream[(K, S)] = ssc.withScope {
  updateStateByKey(updateFunc, defaultPartitioner())
}
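A minimal runnable sketch (my construction, not code from the thread) of
calling updateStateByKey with a Seq[Int] state, which satisfies the
[S: ClassTag] bound above; the socket source and the append-style update
function are illustrative assumptions:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SeqStateExample {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("SeqStateExample"), Seconds(2))
    ssc.checkpoint("checkpoint") // updateStateByKey requires checkpointing

    val wordDstream = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" "))
      .map(w => (w, 1))

    // Seq[Int] has an implicit ClassTag, so the bound is satisfied; an
    // erased element type (e.g. a generic Array[T]) would not compile.
    val updateFunc = (values: Seq[Int], state: Option[Seq[Int]]) =>
      Some(state.getOrElse(Seq.empty[Int]) ++ values)

    val stateDstream = wordDstream.updateStateByKey[Seq[Int]](updateFunc)
    stateDstream.print()
    ssc.start()
    ssc.awaitTermination()
  }
}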
hi all,
a java.lang.ArrayIndexOutOfBoundsException is thrown when I use the following
spark sql on spark standalone or yarn.
the sql:
select ta.*
from bi_td.dm_price_seg_td tb
join bi_sor.sor_ord_detail_tf ta
on 1 = 1
where ta.sale_dt = '20140514'
and ta.sale_price >= tb.pri_from
Akhil,
Locally, all nodes must have the same jar because the driver will be
assigned to a random node; otherwise the driver log will report that no jar
was found.
Ricky Ou(欧 锐)
From: Akhil Das
Date: 2015-11-02 17:59
To: our...@cnsuning.com
CC: user; 494165115
hi all,
when using the command:
spark-submit --deploy-mode cluster --jars hdfs:///user/spark/cypher.jar
--class com.suning.spark.jdbc.MysqlJdbcTest hdfs:///user/spark/MysqlJdbcTest.jar
the program throws an exception that a class in cypher.jar cannot be found;
the driver log shows no --jars
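One commonly suggested workaround (an assumption on my part; the thread does
not show a resolution) is to build the dependency into a single assembly jar
so the driver does not rely on --jars propagation in cluster mode.
MysqlJdbcTest-assembly.jar below is a hypothetical uber jar that already
contains the classes from cypher.jar:

spark-submit --deploy-mode cluster \
  --class com.suning.spark.jdbc.MysqlJdbcTest \
  hdfs:///user/spark/MysqlJdbcTest-assembly.jar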
hi all,
when using the JavaSparkSQL example, the code was submitted many times, as
follows:
/home/spark/software/spark/bin/spark-submit --deploy-mode cluster --class
org.apache.spark.examples.sql.JavaSparkSQL
hdfs://SuningHadoop2/user/spark/lib/spark-examples-1.4.0-hadoop2.4.0.jar
unfortunately, sometimes no event logs are found for the application; see
https://issues.apache.org/jira/browse/SPARK-10832
From: our...@cnsuning.com
Date: 2015-09-25 20:36
To: user
CC: 494165115
Subject: sometimes no event logs found for application using same JavaSparkSQL
example
hi all,
when using the JavaSparkSQL example, the code was submitted many times
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
From: Terry Hole
Date: 2015-08-28 17:22
To: our...@cnsuning.com
CC: user; hao.cheng; Huang, Jie
Subject: Re: Job aborted due to stage failure:
java.lang.StringIndexOutOfBoundsException: String index out of range: 18
Ricky,
You may need
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Ricky Ou(欧 锐)
Department: Suning Commerce, IT HQ Technology Support R&D Center,
Big Data Center, Data Platform Development Department
email : our...@cnsuning.com