Re: welcome a new batch of committers

2018-10-05 Thread Suresh Thalamati
Congratulations to all!

-suresh

On Wed, Oct 3, 2018 at 1:59 AM Reynold Xin  wrote:

> Hi all,
>
> The Apache Spark PMC has recently voted to add several new committers to
> the project, for their contributions:
>
> - Shane Knapp (contributor to infra)
> - Dongjoon Hyun (contributor to ORC support and other parts of Spark)
> - Kazuaki Ishizaki (contributor to Spark SQL)
> - Xingbo Jiang (contributor to Spark Core and SQL)
> - Yinan Li (contributor to Spark on Kubernetes)
> - Takeshi Yamamuro (contributor to Spark SQL)
>
> Please join me in welcoming them!
>
>


Re: Welcoming Tejas Patil as a Spark committer

2017-10-03 Thread Suresh Thalamati
Congratulations, Tejas!

-suresh

> On Sep 29, 2017, at 12:58 PM, Matei Zaharia  wrote:
> 
> Hi all,
> 
> The Spark PMC recently added Tejas Patil as a committer on the
> project. Tejas has been contributing across several areas of Spark for
> a while, focusing especially on scalability issues and SQL. Please
> join me in welcoming Tejas!
> 
> Matei
> 
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] [SPIP] SPARK-15689: Data Source API V2 read path

2017-09-06 Thread Suresh Thalamati
+1 (non-binding)


> On Sep 6, 2017, at 7:29 PM, Wenchen Fan  wrote:
> 
> Hi all,
> 
> In the previous discussion, we decided to split the read and write path of 
> data source v2 into 2 SPIPs, and I'm sending this email to call a vote for 
> Data Source V2 read path only.
> 
> The full document of the Data Source API V2 is:
> https://docs.google.com/document/d/1n_vUVbF4KD3gxTmkNEon5qdQ-Z8qU5Frf6WMQZ6jJVM/edit
>  
> 
> 
> The ready-for-review PR that implements the basic infrastructure for the read 
> path is:
> https://github.com/apache/spark/pull/19136 
> 
> 
> The vote will be up for the next 72 hours. Please reply with your vote:
> 
> +1: Yeah, let's go forward and implement the SPIP.
> +0: Don't really care.
> -1: I don't think this is a good idea because of the following technical 
> reasons.
> 
> Thanks!



Re: Welcoming Saisai (Jerry) Shao as a committer

2017-08-28 Thread Suresh Thalamati
Congratulations, Jerry

> On Aug 28, 2017, at 6:28 PM, Matei Zaharia  wrote:
> 
> Hi everyone,
> 
> The PMC recently voted to add Saisai (Jerry) Shao as a committer. Saisai has 
> been contributing to many areas of the project for a long time, so it’s great 
> to see him join. Join me in thanking and congratulating him!
> 
> Matei
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: welcoming Burak and Holden as committers

2017-01-24 Thread Suresh Thalamati
Congratulations Burak and Holden!

-suresh

> On Jan 24, 2017, at 10:13 AM, Reynold Xin  wrote:
> 
> Hi all,
> 
> Burak and Holden have recently been elected as Apache Spark committers.
> 
> Burak has been very active in a large number of areas in Spark, including 
> linear algebra, stats/maths functions in DataFrames, Python/R APIs for 
> DataFrames, dstream, and most recently Structured Streaming.
> 
> Holden has been a long-time Spark contributor and evangelist. She has written 
> a few books on Spark and has made frequent contributions to the Python API to 
> improve its usability and performance.
> 
> Please join me in welcoming the two!
> 
> 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: welcoming Xiao Li as a committer

2016-10-04 Thread Suresh Thalamati
Congratulations, Xiao!



> On Oct 3, 2016, at 10:46 PM, Reynold Xin  wrote:
> 
> Hi all,
> 
> Xiao Li, aka gatorsmile, has recently been elected as an Apache Spark 
> committer. Xiao has been a super active contributor to Spark SQL. Congrats 
> and welcome, Xiao!
> 
> - Reynold
> 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC3)

2016-09-27 Thread Suresh Thalamati

+1 (non-binding)

-suresh


> On Sep 26, 2016, at 11:11 PM, Jagadeesan As  wrote:
> 
> +1 (non binding)
>  
> Cheers,
> Jagadeesan A S
> 
> 
> 
> 
> From: Jean-Baptiste Onofré 
> To: dev@spark.apache.org
> Date: 27-09-16 11:27 AM
> Subject: Re: [VOTE] Release Apache Spark 2.0.1 (RC3)
> 
> 
> 
> +1 (non binding)
> 
> Regards
> JB
> 
> On 09/27/2016 07:51 AM, Hyukjin Kwon wrote:
> > +1 (non-binding)
> >
> > On 2016-09-27 13:22 GMT+09:00, Denny Lee wrote:
> >
> > +1 on testing with Python2.
> >
> > On Mon, Sep 26, 2016 at 3:13 PM, Krishna Sankar wrote:
> >
> > I do run both Python and Scala, but via iPython/Python2 with my
> > own test code, not running the tests from the distribution.
> > Cheers
> >
> > On Mon, Sep 26, 2016 at 11:59 AM, Holden Karau wrote:
> >
> > I'm seeing some test failures with Python 3 that could
> > definitely be environmental (going to rebuild my virtual env
> > and double check). I'm just wondering if other people are
> > also running the Python tests on this release, or if everyone
> > is focused on the Scala tests?
> >
> > On Mon, Sep 26, 2016 at 11:48 AM, Maciej Bryński wrote:
> >
> > +1
> > At last :)
> >
> > On 2016-09-26 19:56 GMT+02:00, Sameer Agarwal wrote:
> >
> > +1 (non-binding)
> >
> > On Mon, Sep 26, 2016 at 9:54 AM, Davies Liu wrote:
> >
> > +1 (non-binding)
> >
> > On Mon, Sep 26, 2016 at 9:36 AM, Joseph Bradley wrote:
> >
> > +1
> >
> > On Mon, Sep 26, 2016 at 7:47 AM, Denny Lee wrote:
> >
> > +1 (non-binding)
> >
> > On Sun, Sep 25, 2016 at 23:20, Jeff Zhang wrote:
> >
> > +1
> >
> > On Mon, Sep 26, 2016 at 2:03 PM, Shixiong (Ryan) Zhu wrote:
> >
> > +1
> >
> > On Sun, Sep 25, 2016 at 10:43 PM, Pete Lee wrote:
> >
> > +1
> >
> > On Sun, Sep 25, 2016 at 3:26 PM, Herman van Hövell tot Westerflier wrote:
> >
> > +1 (non-binding)
> >
> > On Sun, Sep 25, 2016 at 2:05 PM, Ricardo Almeida wrote:
> >
> > +1 (non-binding)
> >
> > Built and tested on
> > - Ubuntu 16.04 / OpenJDK 1.8.0_91
> > - CentOS / Oracle Java 1.7.0_55
> > (-Phadoop-2.7 -Dhadoop.version=2.7.3

Re: Unable to run docker jdbc integrations test ?

2016-09-09 Thread Suresh Thalamati
I agree with Josh. These tests are valuable, even if they cannot be run on 
Jenkins due to setup issues. It would be good to run them at least manually 
when JDBC data source-specific changes are made. I filed a JIRA for this problem:

https://issues.apache.org/jira/browse/SPARK-17473
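Josh's suggestion below (thin scripting over the `docker` command-line tool instead of the Spotify docker-client library) could look roughly like the sketch here. The image tag, container name, password, and host port are illustrative assumptions, not what the integration suite actually uses:

```python
import subprocess

# Hypothetical helper: build the `docker run` argv for a throwaway MySQL
# container, instead of going through the Spotify docker-client library.
# Image tag, container name, password, and host port are illustrative only.
def mysql_run_cmd(name="spark-it-mysql", host_port=3306, image="mysql:5.7"):
    return [
        "docker", "run", "-d",
        "--name", name,
        "-e", "MYSQL_ROOT_PASSWORD=rootpass",
        "-p", "%d:3306" % host_port,
        image,
    ]

def start_container(cmd):
    # `docker run -d` prints the new container id on stdout.
    return subprocess.check_output(cmd).strip()

if __name__ == "__main__":
    print(" ".join(mysql_run_cmd()))
```

The appeal is that the test harness only needs to build and run a couple of `docker` commands (run, wait for readiness, rm), so a small wrapper like this sidesteps the dependency-hell problems of a full client library.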



> On Sep 7, 2016, at 4:58 PM, Luciano Resende <luckbr1...@gmail.com> wrote:
> 
> That might be a reasonable and much simpler approach to try... but if we 
> resolve these issues, we should make it part of some frequent build to make 
> sure the build doesn't regress and that the actual functionality doesn't 
> regress either. Let me look into this again...
> 
> On Wed, Sep 7, 2016 at 2:46 PM, Josh Rosen <joshro...@databricks.com 
> <mailto:joshro...@databricks.com>> wrote:
> I think that these tests are valuable so I'd like to keep them. If possible, 
> though, we should try to get rid of our dependency on the Spotify 
> docker-client library, since it's a dependency hell nightmare. Given our 
> relatively simple use of Docker here, I wonder whether we could just write 
> some simple scripting over the `docker` command-line tool instead of pulling 
> in such a problematic library.
> 
> On Wed, Sep 7, 2016 at 2:36 PM Luciano Resende <luckbr1...@gmail.com 
> <mailto:luckbr1...@gmail.com>> wrote:
> It looks like there is nobody running these tests, and after some dependency 
> upgrades in Spark 2.0 they have stopped working. I have tried to bring this up, 
> but I am having some issues getting the right dependencies loaded and 
> satisfying the docker-client expectations. 
> 
> The question then is: does the community find value in having these tests 
> available? If so, we can focus on bringing them up and I can push my 
> previous experiments as a WIP PR. Otherwise we should just get rid of these 
> tests.
> 
> Thoughts ?
> 
> 
> On Tue, Sep 6, 2016 at 4:05 PM, Suresh Thalamati <suresh.thalam...@gmail.com 
> <mailto:suresh.thalam...@gmail.com>> wrote:
> Hi, 
> 
> 
> I am getting the following error when I am trying to run the JDBC Docker 
> integration tests on my laptop. Any ideas what I might be doing wrong?
> 
> build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0  -Phive-thriftserver 
> -Phive -DskipTests clean install
> build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11 
>  compile test
> 
> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; 
> support was removed in 8.0
> Discovery starting.
> Discovery completed in 200 milliseconds.
> Run starting. Expected test count is: 10
> MySQLIntegrationSuite:
> 
> Error:
> 16/09/06 11:52:00 INFO BlockManagerMaster: Registered BlockManager 
> BlockManagerId(driver, 9.31.117.25, 51868)
> *** RUN ABORTED ***
>   java.lang.AbstractMethodError:
>   at 
> org.glassfish.jersey.model.internal.CommonConfig.configureAutoDiscoverableProviders(CommonConfig.java:622)
>   at 
> org.glassfish.jersey.client.ClientConfig$State.configureAutoDiscoverableProviders(ClientConfig.java:357)
>   at 
> org.glassfish.jersey.client.ClientConfig$State.initRuntime(ClientConfig.java:392)
>   at 
> org.glassfish.jersey.client.ClientConfig$State.access$000(ClientConfig.java:88)
>   at 
> org.glassfish.jersey.client.ClientConfig$State$3.get(ClientConfig.java:120)
>   at 
> org.glassfish.jersey.client.ClientConfig$State$3.get(ClientConfig.java:117)
>   at 
> org.glassfish.jersey.internal.util.collection.Values$LazyValueImpl.get(Values.java:340)
>   at 
> org.glassfish.jersey.client.ClientConfig.getRuntime(ClientConfig.java:726)
>   at 
> org.glassfish.jersey.client.ClientRequest.getConfiguration(ClientRequest.java:285)
>   at 
> org.glassfish.jersey.client.JerseyInvocation.validateHttpMethodAndEntity(JerseyInvocation.java:126)
>   ...
> 16/09/06 11:52:00 INFO SparkContext: Invoking stop() from shutdown hook
> 16/09/06 11:52:00 INFO MapOutputTrackerMasterEndpoint: 
> MapOutputTrackerMasterEndpoint stopped!
> 
> 
> 
> Thanks
> -suresh
> 
> 
> 
> 
> -- 
> Luciano Resende
> http://twitter.com/lresende1975 <http://twitter.com/lresende1975>
> http://lresende.blogspot.com/ <http://lresende.blogspot.com/>
> 
> 
> -- 
> Luciano Resende
> http://twitter.com/lresende1975 <http://twitter.com/lresende1975>
> http://lresende.blogspot.com/ <http://lresende.blogspot.com/>


Unable to run docker jdbc integrations test ?

2016-09-06 Thread Suresh Thalamati
Hi, 


I am getting the following error when I am trying to run the JDBC Docker 
integration tests on my laptop. Any ideas what I might be doing wrong?

build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0  -Phive-thriftserver 
-Phive -DskipTests clean install
build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11  
compile test

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; 
support was removed in 8.0
Discovery starting.
Discovery completed in 200 milliseconds.
Run starting. Expected test count is: 10
MySQLIntegrationSuite:

Error:
16/09/06 11:52:00 INFO BlockManagerMaster: Registered BlockManager 
BlockManagerId(driver, 9.31.117.25, 51868)
*** RUN ABORTED ***
  java.lang.AbstractMethodError:
  at 
org.glassfish.jersey.model.internal.CommonConfig.configureAutoDiscoverableProviders(CommonConfig.java:622)
  at 
org.glassfish.jersey.client.ClientConfig$State.configureAutoDiscoverableProviders(ClientConfig.java:357)
  at 
org.glassfish.jersey.client.ClientConfig$State.initRuntime(ClientConfig.java:392)
  at 
org.glassfish.jersey.client.ClientConfig$State.access$000(ClientConfig.java:88)
  at org.glassfish.jersey.client.ClientConfig$State$3.get(ClientConfig.java:120)
  at org.glassfish.jersey.client.ClientConfig$State$3.get(ClientConfig.java:117)
  at 
org.glassfish.jersey.internal.util.collection.Values$LazyValueImpl.get(Values.java:340)
  at org.glassfish.jersey.client.ClientConfig.getRuntime(ClientConfig.java:726)
  at 
org.glassfish.jersey.client.ClientRequest.getConfiguration(ClientRequest.java:285)
  at 
org.glassfish.jersey.client.JerseyInvocation.validateHttpMethodAndEntity(JerseyInvocation.java:126)
  ...
16/09/06 11:52:00 INFO SparkContext: Invoking stop() from shutdown hook
16/09/06 11:52:00 INFO MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!



Thanks
-suresh
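A side note on the trace above: an `AbstractMethodError` at runtime usually means two binary-incompatible versions of a library (here Jersey) ended up on the classpath. One quick way to check is to dump `build/mvn dependency:tree` for the module and scan it for mixed `org.glassfish.jersey` versions. A rough sketch of that scan follows; the sample tree lines are made up for illustration, not taken from a real Spark build:

```python
import re

# Scan `mvn dependency:tree` output for artifacts in one group and collect
# the distinct versions seen; more than one version on the classpath is a
# common cause of AbstractMethodError. Sample text below is illustrative.
def versions_in_tree(tree_text, group="org.glassfish.jersey"):
    pattern = re.compile(re.escape(group) + r"[^:]*:([^:]+):jar:([^:\s]+)")
    versions = {}
    for artifact, version in pattern.findall(tree_text):
        versions.setdefault(version, set()).add(artifact)
    return versions

if __name__ == "__main__":
    sample = """\
[INFO] +- org.glassfish.jersey.core:jersey-client:jar:2.22.2:compile
[INFO] +- org.glassfish.jersey.core:jersey-common:jar:2.22.2:compile
[INFO] \\- org.glassfish.jersey.core:jersey-common:jar:2.6:test
"""
    for version, artifacts in sorted(versions_in_tree(sample).items()):
        print(version, sorted(artifacts))
```

If two versions show up, the usual fix is an exclusion or dependency-management entry pinning one of them.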



Re: Welcoming Felix Cheung as a committer

2016-08-08 Thread Suresh Thalamati
Congratulations , Felix!



> On Aug 8, 2016, at 11:15 AM, Ted Yu  wrote:
> 
> Congratulations, Felix.
> 
> On Mon, Aug 8, 2016 at 11:15 AM, Matei Zaharia  > wrote:
> Hi all,
> 
> The PMC recently voted to add Felix Cheung as a committer. Felix has been a 
> major contributor to SparkR and we're excited to have him join officially. 
> Congrats and welcome, Felix!
> 
> Matei
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org 
> 
> 
> 



Re: [VOTE] Release Apache Spark 2.0.0 (RC5)

2016-07-22 Thread Suresh Thalamati
+1 (non-binding)

Tested the data source API and JDBC data sources. 


> On Jul 19, 2016, at 7:35 PM, Reynold Xin  wrote:
> 
> Please vote on releasing the following candidate as Apache Spark version 
> 2.0.0. The vote is open until Friday, July 22, 2016 at 20:00 PDT and passes 
> if a majority of at least 3 +1 PMC votes are cast.
> 
> [ ] +1 Release this package as Apache Spark 2.0.0
> [ ] -1 Do not release this package because ...
> 
> 
> The tag to be voted on is v2.0.0-rc5 
> (13650fc58e1fcf2cf2a26ba11c819185ae1acc1f).
> 
> This release candidate resolves ~2500 issues: 
> https://s.apache.org/spark-2.0.0-jira 
> 
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc5-bin/ 
> 
> 
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc 
> 
> 
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1195/ 
> 
> 
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc5-docs/ 
> 
> 
> 
> =
> How can I help test this release?
> =
> If you are a Spark user, you can help us test this release by taking an 
> existing Spark workload, running it on this release candidate, and reporting 
> any regressions from 1.x.
> 
> ==
> What justifies a -1 vote for this release?
> ==
> Critical bugs impacting major functionalities.
> 
> Bugs already present in 1.x, missing features, or bugs related to new 
> features will not necessarily block this release. Note that historically 
> Spark documentation has been published on the website separately from the 
> main release so we do not need to block the release due to documentation 
> errors either.
> 



Re: Welcoming Yanbo Liang as a committer

2016-06-04 Thread Suresh Thalamati
Congratulations, Yanbo

> On Jun 3, 2016, at 7:48 PM, Matei Zaharia  wrote:
> 
> Hi all,
> 
> The PMC recently voted to add Yanbo Liang as a committer. Yanbo has been a 
> super active contributor in many areas of MLlib. Please join me in welcoming 
> Yanbo!
> 
> Matei
> -
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
> 


-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Welcoming two new committers

2016-02-08 Thread Suresh Thalamati
Congratulations Herman and Wenchen!

On Mon, Feb 8, 2016 at 10:59 AM, Andrew Or  wrote:

> Welcome!
>
> 2016-02-08 10:55 GMT-08:00 Bhupendra Mishra :
>
>> Congratulations to both. and welcome to group.
>>
>> On Mon, Feb 8, 2016 at 10:45 PM, Matei Zaharia 
>> wrote:
>>
>>> Hi all,
>>>
>>> The PMC has recently added two new Spark committers -- Herman van Hovell
>>> and Wenchen Fan. Both have been heavily involved in Spark SQL and Tungsten,
>>> adding new features, optimizations and APIs. Please join me in welcoming
>>> Herman and Wenchen.
>>>
>>> Matei
>>> -
>>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: dev-h...@spark.apache.org
>>>
>>>
>>
>


Re: [VOTE] Release Apache Spark 1.5.1 (RC1)

2015-09-27 Thread Suresh Thalamati
+1 (non-binding)

Tested the JDBC data source and some of the TPC-DS queries.