Re: Unable to change job manager port when launching session cluster on Docker

2019-10-18 Thread Aleksey Pak
Hi Konstantinos,

Can you try using "5000:8081" as the port binding configuration?
In Docker Compose the short syntax is "HOST:CONTAINER", so this binds the
container's 8081 port to port 5000 on the host.
If you want to use port 5000 as the JobManager's port inside the
*container*, you would need to change flink-conf.yaml or pass an additional
command line argument (to override the corresponding config option).
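
For reference, a rough sketch of the relevant part of docker-compose.yml
(assuming the port in question is the web UI/REST port, and keeping the
default 8081 inside the container):

    jobmanager:
      image: ${FLINK_DOCKER_IMAGE_NAME:-flink}
      expose:
        - "6123"
      ports:
        - "5000:8081"   # host port 5000 -> container port 8081 (web UI/REST)
      command: jobmanager
      environment:
        - JOB_MANAGER_RPC_ADDRESS=jobmanager

If you really want the JobManager to listen on port 5000 inside the
container as well, the option to override should be rest.port in
flink-conf.yaml (for example by mounting a modified flink-conf.yaml into
the container), and the ports entry would then become "5000:5000".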

Regards,
Aleksey

On Fri, Oct 18, 2019 at 2:55 PM Papadopoulos, Konstantinos <
konstantinos.papadopou...@iriworldwide.com> wrote:

> Hello all,
>
>
>
> I am trying to launch an Apache Flink session cluster on Docker using
> Docker Compose and following the respective tutorial:
>
>
> https://ci.apache.org/projects/flink/flink-docs-stable/ops/deployment/docker.html#flink-with-docker-compose
>
> The default job manager port (i.e., 8081) is already in use on my host, so
> the cluster fails to launch.
>
> I tried to change the configuration in the respective service definition
> in docker-compose.yml (e.g., services -> jobmanager -> ports: "5000:5000")
> with no success; the job manager container still seems to launch on the
> default port.
>
> Can anybody help me to proceed with this?
>
>
>
> Thanks in advance,
>
> Konstantinos
>
>
>
> P.S.: My docker-compose.yml is the following:
>
>
>
> version: "2.1"
>
> services:
>   jobmanager:
>     image: ${FLINK_DOCKER_IMAGE_NAME:-flink}
>     expose:
>       - "6123"
>     ports:
>       - "5000:5000"
>     command: jobmanager
>     environment:
>       - JOB_MANAGER_RPC_ADDRESS=jobmanager
>
>   taskmanager:
>     image: ${FLINK_DOCKER_IMAGE_NAME:-flink}
>     expose:
>       - "6121"
>       - "6122"
>     depends_on:
>       - jobmanager
>     command: taskmanager
>     links:
>       - "jobmanager:jobmanager"
>     environment:
>       - JOB_MANAGER_RPC_ADDRESS=jobmanager
>


Re: Flink metrics reporters documentation

2019-10-10 Thread Aleksey Pak
Hi Flavio,

Below is my take on your question, based on anecdotal evidence:

As you may know, the Flink distribution package is already Scala-version
specific and bundles some jar artifacts.
A user's Flink job is supposed to be compiled against some of those jars
(with Maven's `provided` scope); the Flink CEP library is one example.
In such cases, jar names are usually preserved as is (so you reference the
same artifact name in your application build and when you copy the jar from
the `/opt` to the `/lib` folder).
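
For illustration, such a provided-scope dependency in the job's pom.xml
could look roughly like this (artifact id and version here are just
placeholders):

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-cep_2.11</artifactId>
      <version>1.9.0</version>
      <scope>provided</scope>
    </dependency>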

Some of the jars are not supposed to be used by your application directly,
but rather as "plugins" in your Flink cluster (here I mean "plugins" in a
broader sense than the plugins mechanism for file systems introduced in
Flink 1.9).
File systems and metrics reporters are good candidates for this. The reason
the original jar artifacts are Scala-version specific is rather
"incidental" (imo): it just happens that they may depend on some core Flink
libraries that still contain Scala code.
In practice, the implementation of those libraries does not depend on
Scala, but to be strict (and safe) they are built separately for different
Scala versions (which is what you see on Maven Central).

My understanding is that one of the goals is to move Scala out of the core
libraries (into some API-level library), which should make some of the
component builds Scala-independent.
Dropping the Scala version from those jar names in the distribution is
probably done with that future plan in mind (so that the user experience
stays the same).

Regards,
Aleksey


On Thu, Oct 10, 2019 at 10:59 AM Flavio Pompermaier 
wrote:

> Sorry,
> I just discovered that those jars are actually in the opt folder within the
> Flink dist.. however the second point still holds: why is there a single
> influxdb jar inside Flink's opt folder while on Maven there are 2 versions
> (one for Scala 2.11 and one for 2.12)?
>
> Best,
> Flavio
>
> On Thu, Oct 10, 2019 at 10:49 AM Flavio Pompermaier 
> wrote:
>
>> Hi to all,
>> I was trying to configure monitoring on my cluster so I went to the
>> metric reporters documentation.
>> There are 2 things that are not clear to me:
>>
>>   1. For all reporters the documentation says to take the jars from the
>>      /opt folder.. obviously this is not true. Wouldn't it be better to
>>      provide a link to the jar directly (on Maven Central, for example)?
>>   2. If you look at the influxdb dependency, the documentation says to use
>>      flink-metrics-influxdb-1.9.0.jar, but there is no such "unified" jar;
>>      on Maven Central there are two versions: one for Scala 2.11 and one
>>      for Scala 2.12.
>>
>> Should I open 2 JIRA tickets to improve those 2 aspects (if I'm not
>> wrong..)?
>>
>> Best,
>> Flavio
>>
>
>


Fwd: Loading dylibs

2019-08-28 Thread Aleksey Pak
Hi Vishwas,

There is a known issue in the Flink Jira project [1].
Is it possible that you have encountered the same problem?

[1]: https://issues.apache.org/jira/browse/FLINK-11402

Regards,
Aleksey


On Tue, Aug 27, 2019 at 8:03 AM Vishwas Siravara 
wrote:

> Hi Jörn,
> I tried that. Here is my snippet :
>
> String[] loadedlibs =
>     getLoadedLibraries(Thread.currentThread().getContextClassLoader());
> if (!containsVibeSimpleLib(loadedlibs)) {
>     System.loadLibrary("vibesimplejava");
> }
>
> Now I get the exception "Unexpected errorjava.lang.UnsatisfiedLinkError:
> com.voltage.securedata.enterprise.ConstantsNative.DIGEST_MD5()I", which
> means that it could not find vibesimplejava in the loaded libs. But I know
> the if branch was not executed, because vibesimplejava was present in
> loadedlibs (the control never went inside the if block). Any other
> suggestions?
>
> Thanks,
> Vishwas
>
>
>
>
>
>
> On Tue, Aug 27, 2019 at 12:25 AM Jörn Franke  wrote:
>
>> I don’t know dylibs in detail, but can you call a static method that
>> checks whether the library has already been loaded and, if not, loads it
>> (singleton pattern)?
>>
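For what it's worth, a rough sketch of that static-guard idea (class and
method names here are made up, and this only prevents loading the library
twice from the *same* classloader, not from a different one, which is the
case FLINK-11402 describes):

    // Loads the native library at most once per classloader.
    public final class VibeSimpleLoader {

        private static boolean loaded = false;

        private VibeSimpleLoader() {
        }

        // Call this everywhere instead of System.loadLibrary(...).
        public static synchronized void ensureLoaded() {
            if (!loaded) {
                // Still throws UnsatisfiedLinkError if the .so was already
                // loaded by a different classloader.
                System.loadLibrary("vibesimplejava");
                loaded = true;
            }
        }
    }
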
>> On 27.08.2019 at 06:39, Vishwas Siravara  wrote:
>>
>> Hi guys,
>> I have a flink application that loads a dylib like this
>>
>> System.loadLibrary("vibesimplejava");
>>
>>
>> The application runs fine, but when I restart the job I get this exception:
>>
>> com.visa.aip.cryptolib.aipcyptoclient.EncryptionException: Unexpected 
>> errorjava.lang.UnsatisfiedLinkError: Native Library 
>> /usr/mware/SimpleAPI/voltage-simple-api-java-05.12.-Linux-x86_64-64b-r234867/lib/libvibesimplejava.so
>>  already loaded in another classloader
>>
>> This happens because the dylib has already been loaded once by the
>> taskmanager. How can I mitigate this? It seems problematic if two
>> applications load the same dylib.
>>
>> Thanks,
>> Vishwas
>>
>>