RE: [ANNOUNCE] New committer: Dawid Wysakowicz

2017-06-20 Thread Vasudevan, Ramkrishna S
Congratulations !! 

-Original Message-
From: Henry Saputra [mailto:henry.sapu...@gmail.com] 
Sent: Tuesday, June 20, 2017 2:19 PM
To: dev@flink.apache.org
Subject: Re: [ANNOUNCE] New committer: Dawid Wysakowicz

Congrats and welcome! =)

- Henry

On Mon, Jun 19, 2017 at 6:55 PM, SHI Xiaogang wrote:

> Congrats  Dawid.
> Great thanks for your contribution!
>
> Xiaogang
>
> 2017-06-19 18:52 GMT+08:00 Dawid Wysakowicz :
>
> > Thank you all for the warm welcome. I will do my best to be as 
> > helpful as possible.
> >
>


RE: [ANNOUNCE] Welcome Jark Wu and Kostas Kloudas as committers

2017-02-15 Thread Vasudevan, Ramkrishna S
Congrats Jark and Kostas.

Regards
Ram

-Original Message-
From: Kostas Kloudas [mailto:k.klou...@data-artisans.com] 
Sent: Monday, February 13, 2017 3:06 PM
To: dev@flink.apache.org
Subject: Re: [ANNOUNCE] Welcome Jark Wu and Kostas Kloudas as committers

Thanks a lot guys! 

Looking forward to working closer with all of you to keep Flink as the most 
advanced open-source stream processor!

Kostas

> On Feb 10, 2017, at 5:07 PM, Henry Saputra  wrote:
> 
> Awesome! Congrats and Welcome, Jark and Kostas K.
> 
> - Henry
> 
> On Tue, Feb 7, 2017 at 12:16 PM, Fabian Hueske  wrote:
> 
>> Hi everybody,
>> 
>> I'm very happy to announce that Jark Wu and Kostas Kloudas accepted 
>> the invitation of the Flink PMC to become committers of the Apache 
>> Flink project.
>> 
>> Jark and Kostas are longtime members of the Flink community.
>> Both are actively driving Flink's development and contributing to its 
>> community in many ways.
>> 
>> Please join me in welcoming Kostas and Jark as committers.
>> 
>> Fabian
>> 



RE: [ANNOUNCE] Welcome Stefan Richter as a new committer

2017-02-15 Thread Vasudevan, Ramkrishna S
Congrats Stefan Richter.

Regards
Ram

-Original Message-
From: Stefan Richter [mailto:s.rich...@data-artisans.com] 
Sent: Monday, February 13, 2017 2:50 PM
To: dev@flink.apache.org
Cc: u...@flink.apache.org; srich...@apache.org
Subject: Re: [ANNOUNCE] Welcome Stefan Richter as a new committer

Thanks a lot! I feel very happy and will try to help the Flink community as well 
as I can :-)

Best,
Stefan 

> Am 10.02.2017 um 11:00 schrieb Ufuk Celebi :
> 
> Hey everyone,
> 
> I'm very happy to announce that the Flink PMC has accepted Stefan 
> Richter to become a committer of the Apache Flink project.
> 
> Stefan is part of the community for almost a year now and worked on 
> major features of the latest 1.2 release, most notably rescaling and 
> backwards compatibility of program state.
> 
> Please join me in welcoming Stefan. :-)
> 
> – Ufuk



RE: Naive question

2016-01-12 Thread Vasudevan, Ramkrishna S

This is Scala IDE Release 4.4.0. So without doing mvn eclipse:eclipse - how do 
you import the project directly? 

Regards
Ram
-Original Message-
From: Chiwan Park [mailto:chiwanp...@apache.org] 
Sent: Tuesday, January 12, 2016 4:54 PM
To: dev@flink.apache.org
Subject: Re: Naive question

Because I tested with Scala IDE 4.3.0 only, the process in the documentation is 
slightly different from my experience.

> On Jan 12, 2016, at 8:21 PM, Stephan Ewen <se...@apache.org> wrote:
> 
> @Chiwan: Is this still up to date from your experience?
> 
> https://ci.apache.org/projects/flink/flink-docs-release-0.10/internals/ide_setup.html
> 
> On Tue, Jan 12, 2016 at 12:04 PM, Chiwan Park <chiwanp...@apache.org> wrote:
> 
>> Hi Ram,
>> 
>> Because there are some Scala IDE (Eclipse) plugins needed, I 
>> recommend avoiding the `mvn eclipse:eclipse` command. Could you try 
>> just running `mvn clean install -DskipTests` and importing the project 
>> into Scala IDE directly? In the middle of the import process, Scala IDE 
>> suggests the needed plugins.
>> 
>> And which version of Scala IDE are you using?
>> 
>>> On Jan 12, 2016, at 7:58 PM, Vasudevan, Ramkrishna S <
>> ramkrishna.s.vasude...@intel.com> wrote:
>>> 
>>> Yes. I added it as Maven project only. I did mvn eclipse:eclipse to
>> create the project and also built the code using mvn clean install 
>> -DskipTests.
>>> 
>>> Regards
>>> Ram
>>> 
>>> -Original Message-
>>> From: ewenstep...@gmail.com [mailto:ewenstep...@gmail.com] On Behalf 
>>> Of
>> Stephan Ewen
>>> Sent: Tuesday, January 12, 2016 4:10 PM
>>> To: dev@flink.apache.org
>>> Subject: Re: Naive question
>>> 
>>> Sorry to hear that it did not work out with Eclipse at all in the 
>>> end,
>> even with all adjustments.
>>> 
>>> Just making sure: You imported Flink as a Maven project, not 
>>> manually
>> adding the big Flink dependency JAR?
>>> 
>>> On Tue, Jan 12, 2016 at 5:15 AM, Vasudevan, Ramkrishna S <
>> ramkrishna.s.vasude...@intel.com> wrote:
>>> 
>>>> Thanks to all. I tried with Scala Eclipse IDE with all these 
>>>> 'change-scala-version.sh'. But in vain.
>>>> 
>>>> So I switched over to Intellij and thing work fine over there. I am 
>>>> new to Intellij so will try using it.
>>>> 
>>>> Once again thanks for helping me out.
>>>> 
>>>> Regards
>>>> Ram
>>>> 
>>>> -Original Message-
>>>> From: Chiwan Park [mailto:chiwanp...@apache.org]
>>>> Sent: Monday, January 11, 2016 4:37 PM
>>>> To: dev@flink.apache.org
>>>> Subject: Re: Naive question
>>>> 
>>>> Hi Ram,
>>>> 
>>>> If you want to build Flink with Scala 2.10, just check out the Flink 
>>>> repository from GitHub or download the source code from the homepage, 
>>>> run `mvn clean install -DskipTests` and import the projects into your 
>>>> IDE. If you want to build Flink with Scala 2.11, you have to run 
>>>> `tools/change-scala-version.sh 2.11` before building the project. You 
>>>> can revert the Scala version change by running 
>>>> `tools/change-scala-version.sh 2.10`.
>>>> 
>>>> About IDEs, the Flink community recommends IntelliJ IDEA because Scala 
>>>> IDE has some problems with Java/Scala mixed projects like Flink. But 
>>>> I tested importing the Flink project with Scala IDE 4.3.0, Scala 2.11.7 
>>>> and Flink 0.10.0 source code. Note that you should import the 
>>>> project as a Maven project.
>>>> 
>>>> By the way, the community welcomes any questions. Please feel free 
>>>> to post questions. :)
>>>> 
>>>>> On Jan 11, 2016, at 7:30 PM, Vasudevan, Ramkrishna S <
>>>> ramkrishna.s.vasude...@intel.com> wrote:
>>>>> 
>>>>> Thank you very much for the reply.
>>>>> I tried different ways and when I tried setting up the root 
>>>>> pom.xml to 2.11
>>>>> 
>>>>> <scala.version>2.11.6</scala.version>
>>>>> <scala.binary.version>2.11</scala.binary.version>
>>>>> 
>>>>> I got the following error
>>>>> [INFO] ------------------------------------------------------------------------
>>>>> [ERROR] Failed to execute goal on project flink-scala: Could 
>>>

RE: Naive question

2016-01-11 Thread Vasudevan, Ramkrishna S
Thank you very much for the reply. 
I tried different ways and when I tried setting up the root pom.xml to 2.11:

<scala.version>2.11.6</scala.version>
<scala.binary.version>2.11</scala.binary.version>

I got the following error
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project flink-scala: Could not resolve dependencies for project org.apache.flink:flink-scala:jar:1.0-SNAPSHOT: Could not find artifact org.scalamacros:quasiquotes_2.11:jar:2.0.1 in central (http://repo.maven.apache.org/maven2) -> [Help 1]

If I leave scala.binary.version at 2.10 and the Scala version at 2.11.6, then I 
get the following problem:
[INFO] C:\flink\flink\flink-runtime\src\test\scala:-1: info: compiling
[INFO] Compiling 366 source files to C:\flink\flink\flink-runtime\target\test-classes at 1452508064750
[ERROR] C:\flink\flink\flink-runtime\src\test\scala\org\apache\flink\runtime\jobmanager\JobManagerITCase.scala:700: error: can't expand macros compiled by previous versions of Scala
[ERROR]   assert(cachedGraph2.isArchived)
[ERROR]   ^

So I am not sure how to proceed with this. If I try to change the version of 
Scala to 2.10 in the IDE, then I get a lot of compilation issues. Is there any 
way to overcome this?

Once again thanks a lot and apologies for the naïve question.

Regards
Ram
-Original Message-
From: ewenstep...@gmail.com [mailto:ewenstep...@gmail.com] On Behalf Of Stephan 
Ewen
Sent: Friday, January 8, 2016 5:01 PM
To: dev@flink.apache.org
Subject: Re: Naive question

Hi!

This looks like a mismatch between the Scala dependency in Flink and the Scala in 
your Eclipse. Make sure you use the same version for both. By default, Flink 
references Scala 2.10.

If your IDE is set up for Scala 2.11, set the Scala version variable in the 
Flink root pom.xml to 2.11 as well.
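For reference, the change Stephan describes would be a sketch along these lines in the root pom.xml properties section (the property names match those mentioned elsewhere in this thread, but the exact values should be verified against the pom.xml of the Flink version in use):

```xml
<properties>
  <!-- full Scala version used by the compiler -->
  <scala.version>2.11.6</scala.version>
  <!-- binary-compatibility suffix used in Scala artifact names, e.g. _2.11 -->
  <scala.binary.version>2.11</scala.binary.version>
</properties>
```

Note that both properties have to agree (2.11.x with binary version 2.11); mixing them, as in the error above, pulls in artifacts compiled for the wrong Scala binary version.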

Greetings,
Stephan




On Fri, Jan 8, 2016 at 12:06 PM, Vasudevan, Ramkrishna S < 
ramkrishna.s.vasude...@intel.com> wrote:

> I have been trying to install, learn and understand Flink. I am using
> the Scala Eclipse IDE as my IDE.
>
> I have downloaded the Flink source code, compiled it and created the project.
>
> My work laptop is Windows based and I don't have eclipse based 
> workstation but I do have linux boxes for running and testing things.
>
> Some of the examples given in Flink source code do run directly from 
> Eclipse but when I try to run the Wordcount example from Eclipse I get 
> this error
>
> Exception in thread "main" java.lang.NoSuchMethodError:
> scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
>  at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
>  at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
>  at akka.actor.RootActorPath.$div(ActorPath.scala:159)
>  at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
>  at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>  at java.lang.reflect.Constructor.newInstance(Unknown Source)
>  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
>  at scala.util.Try$.apply(Try.scala:191)
>  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
>  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>  at scala.util.Success.flatMap(Try.scala:230)
>  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
>  at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585)
>  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:578)
>  at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
>  at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
>  at akka.actor.ActorSystem$.create(ActorSystem.scala:67)
>  at org.apache.flink.runtime.akka.AkkaUtils$.createActorSystem(AkkaUtils.scala:84)
>  at org.apache.flink.runtime.minicluster.FlinkMiniCluster.startJobManagerActorSystem(FlinkMiniCluster.scala:196)
>  at org.apache.flink.runtime.minicluster.FlinkMiniCluster.singleActorSystem$lzycompute$1(FlinkMiniCluster.scala:225)
>  at org.apache.flink.runtime.minicluster.FlinkMiniCluster.org$apache$flink$runtime$minicluster$FlinkMiniCluster$$singleActorSystem$1(FlinkMiniCluster.scala:225)
>  at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$1.apply(FlinkMiniCluster.scala:230)
>  at org

RE: Naive question

2016-01-11 Thread Vasudevan, Ramkrishna S
Thanks to all. I tried the Scala Eclipse IDE with all these 
'change-scala-version.sh' steps, but in vain. 

So I switched over to IntelliJ and things work fine over there. I am new to 
IntelliJ so will try using it.  

Once again thanks for helping me out.

Regards
Ram

-Original Message-
From: Chiwan Park [mailto:chiwanp...@apache.org] 
Sent: Monday, January 11, 2016 4:37 PM
To: dev@flink.apache.org
Subject: Re: Naive question

Hi Ram,

If you want to build Flink with Scala 2.10, just check out the Flink repository 
from GitHub or download the source code from the homepage, run `mvn clean install 
-DskipTests` and import the projects into your IDE. If you want to build Flink 
with Scala 2.11, you have to run `tools/change-scala-version.sh 2.11` before 
building the project. You can revert the Scala version change by running 
`tools/change-scala-version.sh 2.10`.
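Put together, the two build paths Chiwan describes would look roughly like the following command sketch (the repository URL is assumed; the script path matches the 0.10-era source tree discussed here and should be verified against your checkout):

```shell
# Path 1: build with the default Scala 2.10
git clone https://github.com/apache/flink.git
cd flink
mvn clean install -DskipTests

# Path 2: switch the whole build to Scala 2.11 first, then build
tools/change-scala-version.sh 2.11
mvn clean install -DskipTests

# Revert to Scala 2.10 when needed
tools/change-scala-version.sh 2.10
```

The point of the script is that it rewrites the Scala version properties and artifact suffixes consistently across all modules, which avoids the mixed-version errors shown earlier in this thread.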

About IDEs, the Flink community recommends IntelliJ IDEA because Scala IDE has 
some problems with Java/Scala mixed projects like Flink. But I tested importing 
the Flink project with Scala IDE 4.3.0, Scala 2.11.7 and Flink 0.10.0 source 
code. Note that you should import the project as a Maven project.

By the way, the community welcomes any questions. Please feel free to post 
questions. :)

> On Jan 11, 2016, at 7:30 PM, Vasudevan, Ramkrishna S 
> <ramkrishna.s.vasude...@intel.com> wrote:
> 
> Thank you very much for the reply. 
> I tried different ways and when I tried setting up the root pom.xml to 
> 2.11
> 
> <scala.version>2.11.6</scala.version>
> <scala.binary.version>2.11</scala.binary.version>
> 
> I got the following error
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project flink-scala: Could not 
> resolve dependencies for project 
> org.apache.flink:flink-scala:jar:1.0-SNAPSHOT: Could not find artifact 
> org.scalamacros:quasiquotes_2.11:jar:2.0.1 in central 
> (http://repo.maven.apache.org/maven2) -> [Help 1]
> 
> If I leave scala.binary.version at 2.10 and the Scala version at 2.11.6, 
> then I get the following problem:
> [INFO] C:\flink\flink\flink-runtime\src\test\scala:-1: info: compiling
> [INFO] Compiling 366 source files to C:\flink\flink\flink-runtime\target\test-classes at 1452508064750
> [ERROR] C:\flink\flink\flink-runtime\src\test\scala\org\apache\flink\runtime\jobmanager\JobManagerITCase.scala:700: error: can't expand macros compiled by previous versions of Scala
> [ERROR]   assert(cachedGraph2.isArchived)
> [ERROR]   ^
> 
> So I am not sure how to proceed with this. If I try to change the version 
> of Scala to 2.10 in the IDE, then I get a lot of compilation issues. Is 
> there any way to overcome this?
> 
> Once again thanks a lot and apologies for the naïve question.
> 
> Regards
> Ram
> -Original Message-
> From: ewenstep...@gmail.com [mailto:ewenstep...@gmail.com] On Behalf 
> Of Stephan Ewen
> Sent: Friday, January 8, 2016 5:01 PM
> To: dev@flink.apache.org
> Subject: Re: Naive question
> 
> Hi!
> 
> This looks like a mismatch between the Scala dependency in Flink and the 
> Scala in your Eclipse. Make sure you use the same version for both. By 
> default, Flink references Scala 2.10.
> 
> If your IDE is set up for Scala 2.11, set the Scala version variable 
> in the Flink root pom.xml to 2.11 as well.
> 
> Greetings,
> Stephan
> 
> 
> 
> 
> On Fri, Jan 8, 2016 at 12:06 PM, Vasudevan, Ramkrishna S < 
> ramkrishna.s.vasude...@intel.com> wrote:
> 
>> I have been trying to install, learn and understand Flink. I am using
>> the Scala Eclipse IDE as my IDE.
>> 
>> I have downloaded the Flink source code, compiled it and created the project.
>> 
>> My work laptop is Windows based and I don't have eclipse based 
>> workstation but I do have linux boxes for running and testing things.
>> 
>> Some of the examples given in Flink source code do run directly from 
>> Eclipse but when I try to run the Wordcount example from Eclipse I 
>> get this error
>> 
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
>> at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
>> at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
>> at akka.actor.RootActorPath.$div(ActorPath.scala:159)
>> at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
>> at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>> at sun.reflect