Re: Unsubscribe
On Fri, Jun 15, 2018 at 11:17 PM, Mikhail Dubkov wrote:
> Unsubscribe
>
> On Thu, Jun 14, 2018 at 8:38 PM, Kumar S, Sajive wrote:
>> Unsubscribe
Re: Re: Support SqlStreaming in spark
Unsubscribe

2018-06-15 9:20 GMT+02:00 stc:
> The repo you give may solve some of SqlStreaming's problems, but it is not
> friendly enough; users need to learn this new syntax.
>
> --
> Jacky Lee
> Mail: qcsd2...@163.com
>
> At 2018-06-15 11:48:01, "Bowden, Chris" wrote:
>> Not sure if there is a question in here, but if you are hinting that
>> structured streaming should support a SQL interface, Spark has appropriate
>> extensibility hooks to make it possible. However, the most powerful
>> construct in structured streaming is quite difficult to find a SQL
>> equivalent for (e.g., flatMapGroupsWithState). This repo could use some
>> cleanup, but it is an example of providing a SQL interface to a subset of
>> structured streaming's functionality:
>> https://github.com/vertica/pstl/blob/master/pstl/src/main/antlr4/org/apache/spark/sql/catalyst/parser/pstl/PstlSqlBase.g4
>>
>> From: JackyLee
>> Sent: Thursday, June 14, 2018 7:06:17 PM
>> To: dev@spark.apache.org
>> Subject: Support SqlStreaming in spark
>>
>> Hello,
>>
>> Nowadays, more and more streaming products are beginning to support SQL
>> streaming, such as KafkaSQL, Flink SQL, and Storm SQL. Supporting SQL
>> streaming not only lowers the barrier to entry for streaming, but also
>> makes streaming easier for everyone to adopt.
>>
>> At present, Structured Streaming is relatively mature, and since it is
>> built on the Dataset API, it is possible to provide a SQL portal for
>> Structured Streaming and run Structured Streaming jobs in SQL.
>>
>> To support SQL streaming, there are two key points:
>> 1. The parser should be able to parse streaming-type SQL.
>> 2. The analyzer should be able to map metadata information to the
>> corresponding relation.
>>
>> Running Structured Streaming in SQL brings several benefits:
>> 1. It lowers the entry barrier of Structured Streaming and attracts users
>> more easily.
>> 2. It encapsulates the metadata of a source or sink into a table, so it
>> can be maintained and managed uniformly and is more accessible to users.
>> 3. Metadata permission management, which is based on Hive, can tie in more
>> closely with Structured Streaming's overall authorization scheme.
>>
>> We have found some ways to solve this problem. It's a pleasure to discuss
>> it with you.
>>
>> Thanks,
>>
>> Jackey Lee
>>
>> --
>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
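[Editor's note: for readers unfamiliar with the construct Chris mentions above, here is a minimal sketch of the per-group stateful logic that flatMapGroupsWithState expresses and that is hard to capture in SQL. To keep it self-contained it is modeled in plain Scala without a Spark session; the EventCount state type, the running-count logic, and the Option-based state handle (standing in for Spark's GroupState[S]) are all hypothetical illustrations, not Spark's API itself.]

```scala
// Per-group state carried between micro-batches (hypothetical example type).
case class EventCount(total: Long)

// Mirrors the (key, values, state) => output shape of the user function
// passed to flatMapGroupsWithState: read prior state, fold in the new
// batch of values, and emit updated rows (as OutputMode.Update would).
def updateCounts(
    key: String,
    values: Iterator[Long],
    state: Option[EventCount]): (EventCount, List[(String, Long)]) = {
  val previous = state.getOrElse(EventCount(0L)).total
  val updated  = EventCount(previous + values.sum)
  (updated, List((key, updated.total)))
}

// Two successive "micro-batches" for the same key:
val (s1, out1) = updateCounts("user-1", Iterator(1L, 2L, 3L), None)
println(out1)  // running total after first batch
val (s2, out2) = updateCounts("user-1", Iterator(4L), Some(s1))
println(out2)  // running total after second batch
```

Because the output of each call depends on arbitrary user-defined state threaded across batches, there is no obvious SQL expression with the same semantics, which is the difficulty raised in the thread.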
Re: [ANNOUNCE] Announcing Apache Spark 2.3.1
Unsubscribe

On Thu, Jun 14, 2018 at 8:59 PM, Jules Damji wrote:
> Matei & I own it. I normally tweet or handle Spark-related PSAs.
>
> Cheers,
> Jules
>
> Sent from my iPhone
> Pardon the dumb thumb typos :)
>
>> On Jun 14, 2018, at 11:45 AM, Marcelo Vanzin wrote:
>>
>> Hi Jacek,
>>
>> I seriously have no idea... I don't even know who owns that account (I
>> hope they have some connection with the PMC?).
>>
>> But it seems whoever owns it already sent something.
>>
>>> On Thu, Jun 14, 2018 at 12:31 AM, Jacek Laskowski wrote:
>>> Hi Marcelo,
>>>
>>> How do we announce it on Twitter @ https://twitter.com/apachespark?
>>> How do we make it part of the release process?
>>>
>>> Regards,
>>> Jacek Laskowski
>>>
>>> https://about.me/JacekLaskowski
>>> Mastering Spark SQL https://bit.ly/mastering-spark-sql
>>> Spark Structured Streaming https://bit.ly/spark-structured-streaming
>>> Mastering Kafka Streams https://bit.ly/mastering-kafka-streams
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>> On Mon, Jun 11, 2018 at 9:47 PM, Marcelo Vanzin wrote:
>>>>
>>>> We are happy to announce the availability of Spark 2.3.1!
>>>>
>>>> Apache Spark 2.3.1 is a maintenance release, based on the branch-2.3
>>>> maintenance branch of Spark. We strongly recommend all 2.3.x users
>>>> upgrade to this stable release.
>>>>
>>>> To download Spark 2.3.1, head over to the download page:
>>>> http://spark.apache.org/downloads.html
>>>>
>>>> To view the release notes:
>>>> https://spark.apache.org/releases/spark-release-2-3-1.html
>>>>
>>>> We would like to acknowledge all community members for contributing to
>>>> this release. This release would not have been possible without you.
>>>>
>>>> --
>>>> Marcelo
unsubscribe
unsubscribe
Re: SparkR was removed from CRAN on 2018-05-01
unsubscribe

On Fri, May 25, 2018 at 11:11 PM, Felix Cheung wrote:
> This is the fix:
>
> https://github.com/apache/spark/commit/f27a035daf705766d3445e5c6a99867c11c552b0#diff-e1e1d3d40573127e9ee0480caf1283d6
>
> I don't have the email though.
>
> From: Hossein
> Sent: Friday, May 25, 2018 10:58:42 AM
> To: dev@spark.apache.org
> Subject: SparkR was removed from CRAN on 2018-05-01
>
> Would you please forward the email from CRAN? Is there a JIRA?
>
> Thanks,
> --Hossein
Re: Re: Welcome Zhenhua Wang as a Spark committer
Congrats

On Mon, Apr 2, 2018 at 12:06 PM, Weichen Xu wrote:
> Congrats Zhenhua!
>
> On Mon, Apr 2, 2018 at 5:32 PM, Gengliang wrote:
>> Congrats, Zhenhua!
>>
>> On Mon, Apr 2, 2018 at 5:19 PM, Marco Gaido wrote:
>>> Congrats Zhenhua!
>>>
>>> 2018-04-02 11:00 GMT+02:00 Saisai Shao:
>>>> Congrats, Zhenhua!
>>>>
>>>> 2018-04-02 16:57 GMT+08:00 Takeshi Yamamuro:
>>>>> Congrats, Zhenhua!
>>>>>
>>>>> On Mon, Apr 2, 2018 at 4:13 PM, Ted Yu wrote:
>>>>>> Congratulations, Zhenhua
>>>>>>
>>>>>> Original message
>>>>>> From: 雨中漫步 <601450...@qq.com>
>>>>>> Date: 4/1/18 11:30 PM (GMT-08:00)
>>>>>> To: Yuanjian Li, Wenchen Fan <cloud0...@gmail.com>
>>>>>> Cc: dev
>>>>>> Subject: Re: Welcome Zhenhua Wang as a Spark committer
>>>>>>
>>>>>> Congratulations Zhenhua Wang
>>>>>>
>>>>>> ---------- Original message ----------
>>>>>> From: "Yuanjian Li"
>>>>>> Sent: Monday, April 2, 2018, 2:26 PM
>>>>>> To: "Wenchen Fan"
>>>>>> Cc: "Spark dev list"
>>>>>> Subject: Re: Welcome Zhenhua Wang as a Spark committer
>>>>>>
>>>>>> Congratulations Zhenhua!!
>>>>>>
>>>>>> 2018-04-02 13:28 GMT+08:00 Wenchen Fan:
>>>>>>> Hi all,
>>>>>>>
>>>>>>> The Spark PMC recently added Zhenhua Wang as a committer on the
>>>>>>> project. Zhenhua is the major contributor of the CBO project, and
>>>>>>> has been contributing across several areas of Spark for a while,
>>>>>>> focusing especially on the analyzer and optimizer in Spark SQL.
>>>>>>> Please join me in welcoming Zhenhua!
>>>>>>>
>>>>>>> Wenchen
>>>>>
>>>>> --
>>>>> ---
>>>>> Takeshi Yamamuro
[MLlib] QuantRegForest
Hi,

We implemented a QuantRegForest to be used with Spark, coded in Scala. We
don't know if you would be interested, but we offer to share it with you.
(BTW, the original implementation is in R and is called quantregForest:
https://cran.r-project.org/web/packages/quantregForest/index.html)

Can't wait to hear from you!

--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
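[Editor's note: for readers unfamiliar with the R package mentioned above — quantile regression forests keep all training responses in each leaf and estimate conditional quantiles from a weighted empirical distribution of those responses. The sketch below shows only the final weighted-quantile step in plain Scala, with the forest and weight computation omitted; the function name and inputs are hypothetical illustrations, not part of the offered implementation.]

```scala
// Given leaf-derived weights per training response, return the smallest
// value whose cumulative normalized weight reaches the target quantile q.
def weightedQuantile(values: Seq[Double], weights: Seq[Double], q: Double): Double = {
  require(values.length == weights.length && q >= 0.0 && q <= 1.0)
  val total  = weights.sum
  val sorted = values.zip(weights).sortBy(_._1) // sort responses ascending
  var cum = 0.0
  for ((v, w) <- sorted) {
    cum += w
    if (cum / total >= q) return v
  }
  sorted.last._1
}

// With equal weights this reduces to an ordinary empirical quantile:
println(weightedQuantile(Seq(1.0, 2.0, 3.0, 4.0), Seq(1.0, 1.0, 1.0, 1.0), 0.5))
```

A point prediction (as from an ordinary regression forest) corresponds to the weighted mean of the same responses; returning quantiles instead is what gives prediction intervals.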