Re: Specify jobmanager port in HA mode

2015-12-21 Thread Ufuk Celebi

> On 17 Dec 2015, at 19:36, Cory Monty  wrote:
> 
> Hey Ufuk,
> 
> We can try building it, but it might be a week or so given the holidays. 
> We're currently targeting development against 1.0.0-SNAPSHOT, and that release
> is fine unless the 1.0 release is more than 2-3 months away.

OK, that’s the expected timeframe for the 1.0.0 release.

> Let me know if you still want us to build your PR and test it out.

The change should be in after the Holidays, so it should be OK to wait if you 
don’t need it now.

– Ufuk



Re: Problem with passing arguments to Flink Web Submission Client

2015-12-21 Thread Filip Łęczycki
Hi,

Regarding the CLI, I have been using
>bin/flink run myJarFile.jar -f flink -i  -m 1
and it works perfectly fine. Is there a difference between these two
ways of submitting a job ("bin/flink MyJar.jar" and "bin/flink run
MyJar.jar")?

I will open a Jira.

Best Regards,
Filip Łęczycki

Pozdrawiam,
Filip Łęczycki

2015-12-20 21:55 GMT+01:00 Matthias J. Sax :

> The bug is actually in the CLI (it's not a WebClient-related issue).
>
> if you run
>
> > bin/flink myJarFile.jar -f flink -i  -m 1
>
> it also returns
>
> > Unrecognized option: -f
>
>
> -Matthias
>
> On 12/20/2015 09:37 PM, Matthias J. Sax wrote:
> > That is a bug. Can you open a JIRA for it?
> >
> > You can work around by not prefixing your flag with "-"
> >
> > -Matthias
> >
> > On 12/20/2015 12:59 PM, Filip Łęczycki wrote:
> >> Hi all,
> >>
> >> I would like to get the pretty-printed execution plan of my job; in order
> >> to achieve that, I uploaded my jar to the Flink Web Submission Client and
> >> tried to run it. However, when I provide arguments for my app I receive
> >> the following error:
> >>
> >> An unexpected error occurred:
> >> Unrecognized option: -f
> >> org.apache.flink.client.cli.CliArgsException: Unrecognized option: -f
> >> at org.apache.flink.client.cli.CliFrontendParser.parseInfoCommand(CliFrontendParser.java:296)
> >> at org.apache.flink.client.CliFrontend.info(CliFrontend.java:376)
> >> at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:983)
> >> at org.apache.flink.client.web.JobSubmissionServlet.doGet(JobSubmissionServlet.java:171)
> >> at javax.servlet.http.HttpServlet.service(HttpServlet.java:734)
> >> at javax.servlet.http.HttpServlet.service(HttpServlet.java:847)
> >> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:532)
> >> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:453)
> >> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:227)
> >> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:965)
> >> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:388)
> >> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:187)
> >> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:901)
> >> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
> >> at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:47)
> >> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:113)
> >> at org.eclipse.jetty.server.Server.handle(Server.java:348)
> >> at org.eclipse.jetty.server.HttpConnection.handleRequest(HttpConnection.java:596)
> >> at org.eclipse.jetty.server.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:1048)
> >> at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:549)
> >> at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:211)
> >> at org.eclipse.jetty.server.HttpConnection.handle(HttpConnection.java:425)
> >> at org.eclipse.jetty.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:489)
> >> at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:436)
> >> at java.lang.Thread.run(Thread.java:745)
> >>
> >> Here is what I write into the "Program Arguments" box:
> >> -f flink -i  -m 1
> >>
> >> Am I doing something wrong, or is this a bug where the web client tries to
> >> interpret my arguments as Flink options?
> >>
> >> Regards/Pozdrawiam,
> >> Filip Łęczycki
> >
>
>
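
Matthias's diagnosis above can be illustrated without Flink. The sketch below is a deliberately simplified, hypothetical parser (it is not Flink's actual CliFrontend code) contrasting the buggy behavior, where any unknown "-" token aborts parsing, with the intended behavior, where everything after the jar file passes through untouched as a program argument:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch only: a toy model of the two parsing strategies
 * discussed in the thread, not Flink's real CLI code.
 */
public class ArgSplitSketch {

    /** Mimics the failing path: any unknown "-" option aborts parsing. */
    static List<String> parseStrict(String[] args) {
        List<String> programArgs = new ArrayList<>();
        for (String a : args) {
            if (a.startsWith("-")) {
                throw new IllegalArgumentException("Unrecognized option: " + a);
            }
            programArgs.add(a);
        }
        return programArgs;
    }

    /** Mimics the intended path: option parsing stops at the jar file,
     *  so later tokens reach the user program even if they start with "-". */
    static List<String> parseAfterJar(String[] args) {
        List<String> programArgs = new ArrayList<>();
        boolean sawJar = false;
        for (String a : args) {
            if (sawJar) {
                programArgs.add(a);   // everything after the jar is a user arg
            } else if (a.endsWith(".jar")) {
                sawJar = true;        // the jar marks the boundary
            }
        }
        return programArgs;
    }

    public static void main(String[] args) {
        String[] cli = {"myJarFile.jar", "-f", "flink", "-m", "1"};
        System.out.println(parseAfterJar(cli));   // prints [-f, flink, -m, 1]
        try {
            parseStrict(cli);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());   // prints Unrecognized option: -f
        }
    }
}
```

This is also why the workaround of dropping the leading "-" works: without the dash, the strict parser no longer mistakes a program argument for a frontend option.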


Re: Serialisation problem

2015-12-21 Thread Fabian Hueske
Hi,

In your program, you apply a distinct transformation on a data set that has
a (nested) GValue[] type. Distinct requires that all fields are comparable
with each other; therefore, all fields of the data set's type must be valid
key types.
However, Flink does not support object arrays as key types, regardless of
the component type and of whether it implements Comparable. Object array
types simply can't be used as keys right now :-(

I see two ways to add this functionality. In both cases you need to get a
bit into the details of Flink's type system, serialization, and comparators.

1) You implement your own type information for GValue[] arrays, plus a
serializer and a comparator, and manually inject this information.

2) You extend ObjectArrayTypeInfo such that it supports key operations if
the component type supports key operations. This would require implementing
a TypeComparator and making some modifications to ObjectArrayTypeInfo. This
change would also be a valuable contribution to Flink.

I recommend having a look at Flink's other type infos, serializers, and
comparators to learn how this is done in Flink.

Best,
Fabian
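
As a stopgap until one of those options is implemented, one can sometimes sidestep the restriction by not keying on the GValue[] at all: derive a comparable key (e.g. a String) from the array and pass a key selector to distinct, so Flink only ever compares Strings. This is a plain-Java sketch of that idea, not something discussed in the thread; the commented-out Flink call assumes the standard DataSet.distinct(KeySelector) API:

```java
/**
 * Workaround sketch for "ObjectArrayTypeInfo cannot be used as key":
 * build a deterministic String key from the array's elements and use
 * that as the distinct key instead of the array itself. Assumes every
 * element has a stable, content-based toString().
 */
public class ArrayKeySketch {

    /** Builds a deterministic, comparable key from an object array. */
    static String arrayKey(Object[] values) {
        StringBuilder sb = new StringBuilder();
        for (Object v : values) {
            // use a separator that cannot occur inside the encoded values;
            // '\u0001' is a common pragmatic choice
            sb.append(v).append('\u0001');
        }
        return sb.toString();
    }

    // With the DataSet API this would be used roughly as:
    //   DataSet<GValue[]> rows = ...;
    //   rows.distinct(row -> arrayKey(row));
    // so the key type Flink sees is String (a valid key type), not GValue[].

    public static void main(String[] args) {
        Object[] r1 = {1, "a", 2.0};
        Object[] r2 = {1, "a", 2.0};
        // content-equal arrays yield equal keys
        System.out.println(arrayKey(r1).equals(arrayKey(r2)));   // prints true
    }
}
```

Note this is only as content-faithful as toString() is; it is a pragmatic workaround, not a substitute for a proper TypeComparator.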




2015-12-20 22:01 GMT+01:00 Abdulrahman kaitoua <
abdulrahman.kait...@outlook.com>:

>
> I still have the same problem even when I extended GValue with Comparable.
> I think the problem might be that Array[GValue] values are not comparable,
> rather than the GValues themselves, but I do not know how to fix this in
> Flink; maybe some implicit ordering would work (and why is this field,
> Array[GValue], compared at all?). I appreciate any and all help.
>
> I still do not understand why this was not a problem in previous versions.
>
> Exception in thread "main"
> org.apache.flink.api.common.InvalidProgramException: This type
> (ObjectArrayTypeInfo) cannot be used as key.
> at org.apache.flink.api.java.operators.Keys$ExpressionKeys.<init>(Keys.java:308)
> at org.apache.flink.api.java.operators.DistinctOperator.<init>(DistinctOperator.java:56)
> at org.apache.flink.api.scala.DataSet.distinct(DataSet.scala:740)
> at it.polimi.genomics.flink.FlinkImplementation.operator.region.GenometricMap4$.execute(GenometricMap4.scala:71)
>
> sealed trait GValue extends Serializable with Comparable[GValue] with Ordered[GValue] {
>   def compare(o : GValue) : Int = {
>     o match {
>       case GDouble(v) => this.asInstanceOf[GDouble].v compare v
>       case GString(v) => this.asInstanceOf[GString].v compare v
>       case GInt(v) => this.asInstanceOf[GInt].v compare v
>       case _ => 0
>     }
>   }
>   // note: each branch must compare against this; the original compared the
>   // value extracted from o with o itself, so the check always succeeded
>   def equal(o : GValue) : Boolean = {
>     o match {
>       case GInt(value) => try { this.asInstanceOf[GInt].v.equals(value) } catch { case e : Throwable => false }
>       case GDouble(value) => try { this.asInstanceOf[GDouble].v.equals(value) } catch { case e : Throwable => false }
>       case GString(value) => try { this.asInstanceOf[GString].v.equals(value) } catch { case e : Throwable => false }
>       case GNull() => this.isInstanceOf[GNull]
>       case _ => false
>     }
>   }
>   override def compareTo(o : GValue) : Int = {
>     o match {
>       case GInt(value) => try { this.asInstanceOf[GInt].v.compareTo(value) } catch { case e : Throwable => 0 }
>       case GDouble(value) => try { this.asInstanceOf[GDouble].v.compareTo(value) } catch { case e : Throwable => 0 }
>       case GString(value) => try { this.asInstanceOf[GString].v.compareTo(value) } catch { case e : Throwable => 0 }
>       case GNull() => 0
>       case _ => 0
>     }
>   }
> }
>
>
>
>
>
> -Abdulrahman Kaitoua-
> Ph.D. Candidate at Politecnico Di Milano
>
>
>
> > Subject: Re: Serialisation problem
> > From: aljos...@apache.org
> > Date: Mon, 14 Dec 2015 10:42:22 +0100
> > To: user@flink.apache.org
>
> >
> > Hi,
> > the problem could be that GValue is not Comparable. Could you try making
> it extend Comparable (the Java Comparable)?
> >
> > Cheers,
> > Aljoscha
> > > On 12 Dec 2015, at 20:43, Robert Metzger  wrote:
> > >
> > > Hi,
> > >
> > > Can you check the log output in your IDE or the log files of the Flink
> client (./bin/flink)? The TypeExtractor logs why a class is not
> recognized as a POJO.
> > >
> > > The log statements look like this:
> > >
> > > 20:42:43,465 INFO org.apache.flink.api.java.typeutils.TypeExtractor -
> class com.dataartisans.debug.MyPojo must have a default constructor to be
> used as a POJO.
> > >
> > >
> > >
> > > On Thu, Dec 10, 2015 at 11:24 PM, Abdulrahman kaitoua <
> abdulrahman.kait...@outlook.com> wrote:
> > >
> > >
> > > Hello,
> > >
> > > I would like to have directions to make my code work again (thanks in
> advance). My code used to work on versions 9.1 and earlier, but when
> I included 10 or 10.1 I got the following exception.
> > >
> > > This type (ObjectArrayTypeInfo) cannot be used as key
> > >
> > > I 
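
As background to the discussion above: object arrays compare by reference identity, not by content, so even content-equal GValue[] instances look different to generic equals/hashCode logic. That is exactly the gap a dedicated TypeComparator would have to close. A small plain-Java demonstration (no Flink dependency):

```java
import java.util.Arrays;

/**
 * Shows why Object[] is a poor key out of the box: equals/hashCode on
 * arrays are identity-based, while element-wise comparison requires
 * explicit Arrays.* calls (the moral equivalent of what a TypeComparator
 * for ObjectArrayTypeInfo would have to do).
 */
public class ArrayEqualitySketch {
    public static void main(String[] args) {
        Object[] a = {1, "x", 2.0};
        Object[] b = {1, "x", 2.0};

        System.out.println(a.equals(b));                              // prints false (identity)
        System.out.println(Arrays.equals(a, b));                      // prints true (element-wise)
        System.out.println(Arrays.hashCode(a) == Arrays.hashCode(b)); // prints true (content-based)
    }
}
```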

Re: Configure log4j with XML files

2015-12-21 Thread Till Rohrmann
Hi Gwenhaël,

as far as I know, there is no direct way to do so. You can either adapt the
flink-daemon.sh script in line 68 to use a different configuration, or you
can test whether the dynamic property -Dlog4j.configurationFile=CONFIG_FILE
overrides the -Dlog4j.configuration property. You can set the dynamic
property using Flink’s env.java.opts configuration parameter.

Cheers,
Till

On Mon, Dec 21, 2015 at 3:34 PM, Gwenhael Pasquiers <
gwenhael.pasqui...@ericsson.com> wrote:

> Hi everybody,
>
>
>
> Could it be possible to have a way to configure log4j with xml files ?
>
>
>
> I’ve looked into the code and it looks like the properties file names are
> hardcoded. However, we need to use XML:
>
> -  We log everything into ELK (Elasticsearch / Logstash / Kibana)
> using SocketAppender
>
> -  The socket appender is synchronous by default and slows the whole app
> if anything goes wrong with the ELK
>
> -  We usually add an AsyncAppender on top of the SocketAppender,
> but this sort of configuration is only possible using an XML config file…
>
>
>
> We’ve already run into this issue: everything was almost paused because the
> ELK was overloaded and extremely slow.
>
>
>
> B.R.
>
>
>
> Gwenhaël PASQUIERS
>
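
For reference, here is a log4j 1.x XML sketch of the setup Gwenhaël describes: an AsyncAppender wrapping a SocketAppender. The host name, port, buffer size, and log level are placeholder values, not taken from the thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">

  <!-- synchronous socket appender shipping events to Logstash -->
  <appender name="socket" class="org.apache.log4j.net.SocketAppender">
    <param name="RemoteHost" value="logstash.example.com"/> <!-- placeholder -->
    <param name="Port" value="4560"/>                       <!-- placeholder -->
    <param name="ReconnectionDelay" value="10000"/>
  </appender>

  <!-- async wrapper so a slow or unreachable ELK cannot stall the job -->
  <appender name="async" class="org.apache.log4j.AsyncAppender">
    <param name="BufferSize" value="512"/>
    <param name="Blocking" value="false"/> <!-- drop events rather than block -->
    <appender-ref ref="socket"/>
  </appender>

  <root>
    <priority value="INFO"/>
    <appender-ref ref="async"/>
  </root>

</log4j:configuration>
```

In log4j 1.x an XML file is loaded through the same log4j.configuration system property as a .properties file, so one way to try this, per Till's suggestion, is setting env.java.opts: -Dlog4j.configuration=file:///path/to/log4j.xml in flink-conf.yaml, with the caveat that the property set by Flink's scripts may take precedence.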


Re: Configure log4j with XML files

2015-12-21 Thread Robert Metzger
as an additional note: Flink sends all files in the /lib folder to all
YARN containers, so you could place the XML file in "/lib" and override the
properties.

I think you need to delete the log4j properties file from the conf/ directory;
then, at least on YARN, we'll not set the -Dlog4j.configuration property.



Re: Problem with passing arguments to Flink Web Submission Client

2015-12-21 Thread Matthias J. Sax
Thanks for opening a JIRA.

I used "info", which yields the error (I forgot it in the mail):

> bin/flink info myJarFile.jar -f flink -i  -m 1

-Matthias



