Github user k-mack commented on a diff in the pull request:

    https://github.com/apache/flink/pull/5900#discussion_r202062325
  
    --- Diff: docs/quickstart/java_api_quickstart.md ---
    @@ -101,23 +111,210 @@ allows to [import Maven 
projects](http://books.sonatype.com/m2eclipse-book/refer
     Some Eclipse bundles include that plugin by default, others require you
     to install it manually. 
     
    -*A note to Mac OS X users*: The default JVM heapsize for Java may be too
    +*Please note*: The default JVM heapsize for Java may be too
     small for Flink. You have to manually increase it.
    -In Eclipse, choose
    -`Run Configurations -> Arguments` and write into the `VM Arguments`
    -box: `-Xmx800m`.
    +In Eclipse, choose `Run Configurations -> Arguments` and write into the 
`VM Arguments` box: `-Xmx800m`.
     In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu. See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details. 
     
    -## Build Project
    +### Build Project
     
     If you want to __build/package your project__, go to your project 
directory and
     run the '`mvn clean package`' command.
     You will __find a JAR file__ that contains your application, plus 
connectors and libraries
     that you may have added as dependencies to the application: 
`target/<artifact-id>-<version>.jar`.
     
     __Note:__ If you use a different class than *StreamingJob* as the 
application's main class / entry point,
    -we recommend you change the `mainClass` setting in the `pom.xml` file 
accordingly. That way, the Flink
    -can run time application from the JAR file without additionally specifying 
the main class.
    +we recommend you change the `mainClass` setting in the `pom.xml` file 
accordingly. That way, Flink
    +can run the application from the JAR file without additionally specifying 
the main class.
    +
    +## Gradle
    +
    +### Requirements
    +
    +The only requirements are working __Gradle 3.x__ (or higher) and __Java 
8.x__ installations.
    +
    +### Create Project
    +
    +Use one of the following commands to __create a project__:
    +
    +<ul class="nav nav-tabs" style="border-bottom: none;">
    +        <li class="active"><a href="#gradle-example" 
data-toggle="tab"><strong>Gradle example</strong></a></li>
    +    <li><a href="#gradle-script" data-toggle="tab">Run the 
<strong>quickstart script</strong></a></li>
    +</ul>
    +<div class="tab-content">
    +    <div class="tab-pane active" id="gradle-example">
    +
    +        <ul class="nav nav-tabs" style="border-bottom: none;">
    +            <li class="active"><a href="#gradle-build" 
data-toggle="tab"><tt>build.gradle</tt></a></li>
    +            <li><a href="#gradle-settings" 
data-toggle="tab"><tt>settings.gradle</tt></a></li>
    +        </ul>
    +        <div class="tab-content">
    +            <div class="tab-pane active" id="gradle-build">
    +                {% highlight gradle %}
    +buildscript {
    +    repositories {
    +        jcenter() // this applies only to the Gradle 'Shadow' plugin
    +    }
    +    dependencies {
    +        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
    +    }
    +}
    +
    +plugins {
    +    id 'java'
    +    id 'application'
    +    // shadow plugin to produce fat JARs
    +    id 'com.github.johnrengelman.shadow' version '2.0.4'
    +}
    +
    +
    +// artifact properties
    +group = 'org.myorg.quickstart'
    +version = '0.1-SNAPSHOT'
    +mainClassName = 'org.myorg.quickstart.StreamingJob'
    +description = """Flink Quickstart Job"""
    +
    +ext {
    +    javaVersion = '1.8'
    +    flinkVersion = '{{ site.version }}'
    +    scalaBinaryVersion = '{{ site.scala_version }}'
    +    slf4jVersion = '1.7.7'
    +    log4jVersion = '1.2.17'
    +}
    +
    +
    +sourceCompatibility = javaVersion
    +targetCompatibility = javaVersion
    +tasks.withType(JavaCompile) {
    +   options.encoding = 'UTF-8'
    +}
    +
    +applicationDefaultJvmArgs = ["-Dlog4j.configuration=log4j.properties"]
    +
    +// declare where to find the dependencies of your project
    +repositories {
    +    mavenCentral()
    +    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
    +}
    +
    +// NOTE: We cannot use "compileOnly" or "shadow" configurations since then 
we could not run code
    +// in the IDE or with "gradle run". We also cannot exclude transitive 
dependencies from the
    +// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
    +// -> Explicitly define the libraries we want to be included in the 
"flinkShadowJar" configuration!
    +configurations {
    +    flinkShadowJar // dependencies which go into the shadowJar
    +
    +    // always exclude these (also from transitive dependencies) since they 
are provided by Flink
    +    flinkShadowJar.exclude group: 'org.apache.flink', module: 
'force-shading'
    +    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 
'jsr305'
    +    flinkShadowJar.exclude group: 'org.slf4j'
    +    flinkShadowJar.exclude group: 'log4j'
    +}
    +
    +// declare the dependencies for your production and test code
    +dependencies {
    +    compile "org.apache.flink:flink-java:${flinkVersion}"
    +    compile 
"org.apache.flink:flink-streaming-java_${scalaBinaryVersion}:${flinkVersion}"
    +
    +    // Add connector dependencies here.
    +    // They must be in the flinkShadowJar configuration in order to be 
included into the shaded jar.
    +    //flinkShadowJar 
"org.apache.flink:flink-connector-kafka-0.11_${scalaBinaryVersion}:${flinkVersion}"
    +
    +    compile "log4j:log4j:${log4jVersion}"
    +    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
    +
    +    // Add test dependencies here.
    +    // testCompile "junit:junit:4.12"
    +}
    +
    +// make the flinkShadowJar dependencies available at compile, run, and test time:
    +sourceSets {
    +    main.runtimeClasspath += configurations.flinkShadowJar
    --- End diff --
    
    I think the only thing left is to update the javadoc task's classpath to avoid Javadoc errors, which would occur if authors reference classes from the flinkShadowJar configuration in their Javadoc comments. I do something similar in my build scripts for Hadoop applications:
    
    ```gradle
    // Prevent JavaDoc errors by including the Hadoop dependencies on the 
javadoc classpath
    javadoc.classpath += configurations.hadoopJar
    ```
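    For the Flink quickstart above, the analogous tweak would target the `flinkShadowJar` configuration defined in the diff (a sketch, assuming that configuration name is kept as-is):
    
    ```gradle
    // Keep Javadoc generation working when doc comments reference types that
    // are only on the flinkShadowJar configuration (e.g. connector classes)
    javadoc.classpath += configurations.flinkShadowJar
    ```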

