Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Felix Cheung
Quick update: We merged 6 fixes Friday and 7 fixes today (thanks!). Since some were hand-merged, I'm waiting for clean builds from Jenkins and for tests to pass. As of now it looks like we need to take one more fix for Scala 2.10. With any luck we should be tagging for the build tomorrow morning (PT).

Re: Timeline for Spark 2.3

2017-11-13 Thread dji...@dataxu.com
Hi, what is the process for requesting that an issue/fix be included in the next release? Is there a place to vote for features? I am interested in https://issues.apache.org/jira/browse/SPARK-13127, to see if we can get Spark to upgrade Parquet to 1.9.0, which addresses the …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Felix Cheung
Any attempt to build with Maven on a clean machine. It couldn't connect to the Maven Central repo. From: Holden Karau Sent: Monday, November 13, 2017 10:38:03 AM To: Felix Cheung Cc: dev@spark.apache.org Subject: Re: Cutting the RC for Spark 2.2.1 …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Holden Karau
Which script is this from? On Mon, Nov 13, 2017 at 10:37 AM Felix Cheung wrote: > Build/test looks good but I'm hitting a new issue with sonatype when > tagging > > "Host name 'repo1.maven.org' does not match the certificate subject > provided by the peer …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Felix Cheung
Ouch ;) Yes, that works, and RC1 is tagged. From: Sean Owen Sent: Monday, November 13, 2017 10:54:48 AM To: Felix Cheung Cc: Holden Karau; dev@spark.apache.org Subject: Re: Cutting the RC for Spark 2.2.1 release It's …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Felix Cheung
Build/test looks good, but I'm hitting a new issue with Sonatype when tagging: "Host name 'repo1.maven.org' does not match the certificate subject provided by the peer (CN=repo.maven.apache.org, O="Sonatype, Inc", L=Fulton, ST=MD, C=US)" https://issues.sonatype.org/browse/MVNCENTRAL-1369 Stay …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Felix Cheung
I did change it, but I'm getting an unknown host error? [ERROR] Non-resolvable parent POM for org.apache.spark:spark-parent_2.11:2.2.1-SNAPSHOT: Could not transfer artifact org.apache:apache:pom:14 from/to central (https://repo.maven.org/maven2): repo.maven.org: Name or service not …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Sean Owen
I'm not seeing a problem building, myself. However, we could change the location of the Maven repository in our POM to https://repo.maven.apache.org/maven2/ without any consequence. The only reason we overrode it was to force it to use HTTPS, which still doesn't look like the default (!): …

Re: Cutting the RC for Spark 2.2.1 release

2017-11-13 Thread Sean Owen
It's repo.maven.apache.org? On Mon, Nov 13, 2017 at 12:52 PM Felix Cheung wrote: > I did change it, but getting unknown host? > > [ERROR] Non-resolvable parent POM for > org.apache.spark:spark-parent_2.11:2.2.1-SNAPSHOT: Could not transfer > artifact …

Re: Reload some static data during struct streaming

2017-11-13 Thread Burak Yavuz
I think if you don't cache the jdbc table, then it should auto-refresh. On Mon, Nov 13, 2017 at 1:21 PM, spark receiver wrote: > Hi > > I'm using struct streaming(spark 2.2) to receive Kafka msg ,it works > great. The thing is I need to join the Kafka message with a …

Reload some static data during struct streaming

2017-11-13 Thread spark receiver
Hi, I'm using Structured Streaming (Spark 2.2) to receive Kafka messages, and it works great. The thing is, I need to join the Kafka messages with a relatively static table stored in a MySQL database (let's call it metadata here). So is it possible to reload the metadata table after some time interval (like …
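
For context, a minimal sketch of the kind of stream-static join being asked about here, assuming a hypothetical Kafka topic and MySQL metadata table (the broker, JDBC URL, topic, table, and column names below are made up for illustration, not taken from the thread):

import org.apache.spark.sql.SparkSession

object StreamStaticJoinSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("stream-static-join").getOrCreate()

    // Streaming side: Kafka messages, keyed by an id we can join on.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")  // hypothetical broker
      .option("subscribe", "events")                     // hypothetical topic
      .load()
      .selectExpr("CAST(key AS STRING) AS id", "CAST(value AS STRING) AS payload")

    // Static side: the MySQL metadata table read over JDBC. Left uncached,
    // so each micro-batch scans the source again (per Burak's reply above).
    val metadata = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://dbhost:3306/mydb")    // hypothetical URL
      .option("dbtable", "metadata")
      .option("user", "spark")
      .option("password", "secret")
      .load()

    // Stream-static inner join, supported by Structured Streaming in 2.2.
    events.join(metadata, Seq("id"))
      .writeStream
      .format("console")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}

Whether the per-batch re-scan of MySQL is acceptable depends on the batch interval and the size of the metadata table, which is what the follow-up message below is about.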

Re: Reload some static data during struct streaming

2017-11-13 Thread spark receiver
I need it cached to improve throughput; I only hope it can be refreshed once a day, not every batch. > On Nov 13, 2017, at 4:49 PM, Burak Yavuz wrote: > > I think if you don't cache the jdbc table, then it should auto-refresh. > > On Mon, Nov 13, 2017 at 1:21 PM, spark …
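
One possible pattern for that (a sketch only, not something proposed in the thread): keep the metadata DataFrame persisted for throughput, and have a background thread unpersist and re-persist it once a day so the next cache materialization re-reads MySQL. The JDBC options and the 24-hour interval are hypothetical, and the interaction with a running streaming query should be verified before relying on it.

import java.util.concurrent.{Executors, TimeUnit}
import org.apache.spark.sql.{DataFrame, SparkSession}

object DailyMetadataRefreshSketch {
  // Re-reads the metadata table from MySQL (hypothetical connection details).
  def loadMetadata(spark: SparkSession): DataFrame =
    spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://dbhost:3306/mydb")
      .option("dbtable", "metadata")
      .option("user", "spark")
      .option("password", "secret")
      .load()

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("daily-metadata-refresh").getOrCreate()

    val metadata = loadMetadata(spark)
    metadata.persist()
    metadata.count()  // materialize the cache eagerly

    // Once a day, drop the cached data and rebuild it; re-materializing the
    // cache runs the JDBC scan again, picking up new rows from MySQL.
    val scheduler = Executors.newSingleThreadScheduledExecutor()
    scheduler.scheduleAtFixedRate(new Runnable {
      override def run(): Unit = {
        metadata.unpersist(blocking = true)
        metadata.persist()
        metadata.count()
      }
    }, 24, 24, TimeUnit.HOURS)

    // ... start the streaming query that joins the Kafka stream with
    // `metadata` here, as in the earlier sketch.
  }
}

Whether a running 2.2 streaming query picks up the re-cached data on its next micro-batch is worth testing first; if it does not, restarting the query on a daily schedule is the blunt fallback.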