[ 
https://issues.apache.org/jira/browse/FLINK-3511?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15169244#comment-15169244
 ] 

ASF GitHub Bot commented on FLINK-3511:
---------------------------------------

GitHub user tillrohrmann opened a pull request:

    https://github.com/apache/flink/pull/1725

    [FLINK-3511] Create example module for Gelly's examples and move connector examples to test scope

    This PR creates an example module for Gelly's examples. The new module sets 
the required dependencies to compile scope so that the examples can be easily 
run from within an IDE.
    
    Additionally, all the connector examples are moved to the test scope of 
each connector module.
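
For illustration, compile-scope dependency declarations in a Maven module look like this (a sketch of what the new flink-gelly-examples pom.xml might contain, not the exact PR contents; the version property is an assumption):

```xml
<!-- Sketch: core dependencies at compile scope so the examples
     resolve on the classpath when run directly from an IDE -->
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${project.version}</version>
    <!-- compile is Maven's default scope, stated here for clarity -->
    <scope>compile</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>${project.version}</version>
    <scope>compile</scope>
  </dependency>
</dependencies>
```

Since compile is Maven's default scope, the explicit <scope> tags are redundant, but they make the contrast with the provided scope of the library modules visible.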

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tillrohrmann/flink fixLibraries

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/1725.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1725
    
----
commit f36ad6010da4619d381f4bdc7a3ea45f9551b577
Author: Till Rohrmann <[email protected]>
Date:   2016-02-26T13:08:02Z

    [FLINK-3511] [gelly] Introduce flink-gelly-examples module
    
    The new flink-gelly-examples module contains all Java and Scala Gelly 
examples. The module
    contains compile scope dependencies on flink-java, flink-scala and 
flink-clients so that
    the examples can be conveniently run from within the IDE.

commit 4612d68e72739f6fe0420ebc4cc6c2bbdd72a8c1
Author: Till Rohrmann <[email protected]>
Date:   2016-02-26T14:57:45Z

    [FLINK-3511] [avro] Move avro examples to test scope

commit 17561bc863481adfd9c0f9df23c666a376354c70
Author: Till Rohrmann <[email protected]>
Date:   2016-02-26T15:12:59Z

    [FLINK-3511] [hadoop-compatibility] Move hadoop-compatibility examples to 
test scope

commit a8745801364624ce13bc195751eaaac1bcccda6f
Author: Till Rohrmann <[email protected]>
Date:   2016-02-26T15:15:44Z

    [FLINK-3511] [jdbc] Move jdbc examples to test scope and add flink-clients 
dependency

commit b849f740766664459be08abb6f2a35e9266edda5
Author: Till Rohrmann <[email protected]>
Date:   2016-02-26T15:21:13Z

    [FLINK-3511] [nifi, elasticsearch] Move nifi and elasticsearch examples to 
test scope

commit 2c053f58e76bc62658f404139a9192a832084f1a
Author: Till Rohrmann <[email protected]>
Date:   2016-02-26T15:27:06Z

    [FLINK-3511] [twitter] Move twitter examples to test scope

----


> Flink library examples not runnable without adding dependencies
> ---------------------------------------------------------------
>
>                 Key: FLINK-3511
>                 URL: https://issues.apache.org/jira/browse/FLINK-3511
>             Project: Flink
>          Issue Type: Bug
>          Components: Build System
>    Affects Versions: 1.0.0
>            Reporter: Márton Balassi
>            Assignee: Till Rohrmann
>
> Recent changes to the build [1] moved many libraries' core dependencies 
> (the ones included in the flink-dist fat jar) to the provided scope.
> The reasoning was that when submitting to the Flink cluster the application 
> already has these dependencies, while when a user writes a program against 
> these libraries she will include the core dependencies explicitly anyway.
> There is, however, one other usage case, namely when someone tries to run 
> an application defined in these libraries that depends on the core jars. For 
> example, if you were to run the Gelly ConnectedComponents example [2] from 
> an IDE after importing Flink (or with java -jar without including the Flink 
> fat jar on the classpath), you would receive the following 
> NoClassDefFoundError as per the current master:
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/apache/flink/api/common/ProgramDescription
>       at java.lang.ClassLoader.defineClass1(Native Method)
>       at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
>       at 
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>       at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>       at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:264)
>       at com.intellij.rt.execution.application.AppMain.main(AppMain.java:122)
> The class missing at runtime is in flink-core, a transitive dependency of 
> the jars moved to the provided scope.
> The funny thing is that we have tests in place to run our examples, but 
> those add test-scope dependencies that re-add the missing classes, so the 
> issue is never discovered.
> I agree with the original purpose of PR #1683, but I also think that the 
> current state makes for a very inconvenient user experience.
> I would like to open a discussion on how and when to resolve the issue given 
> the release of 1.0.0.
> 1. Is it a release blocker?
> 2. Should the change be reverted, or is it sufficient to have proper 
> documentation around it? Maybe a Maven profile explicitly for developing 
> Flink without the provided scope?
> Note that the issue was originally reported by Gábor Gévay.
> [1] https://github.com/apache/flink/pull/1683
> [2] 
> https://github.com/apache/flink/blob/master/flink-libraries/flink-gelly/src/main/java/org/apache/flink/graph/example/ConnectedComponents.java
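
Regarding the Maven-profile idea in point 2 of the quoted issue: such a profile could re-add the provided-scope core dependencies at compile scope for in-IDE development. A hypothetical sketch (the profile id and the dependency shown are invented for illustration, not an actual Flink pom.xml):

```xml
<!-- Hypothetical 'ide' profile: restores compile scope for dependencies
     that are otherwise provided by the flink-dist fat jar -->
<profiles>
  <profile>
    <id>ide</id>
    <dependencies>
      <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-core</artifactId>
        <version>${project.version}</version>
        <scope>compile</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

The profile would be activated with `mvn -Pide` on the command line or via the IDE's Maven profile settings.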



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
