zentol commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798446772



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,122 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"

Review comment:
       Is flinkShadowJar specific to the Flink gradle quickstart? It isn't mentioned anywhere on this page.
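   For reference, the quickstart's `build.gradle` (also quoted in the overview.md hunk below) introduces it as a custom configuration plus the dependencies routed through it; a minimal sketch:

```gradle
configurations {
    // custom configuration for dependencies that should end up in the shadow (fat) JAR
    flinkShadowJar
}

dependencies {
    // example: a connector that must be bundled into the application JAR
    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
}
```
   If this page keeps using `flinkShadowJar`, it probably needs either this declaration or a link to where it is defined.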

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/[email protected])
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics.
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+       options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the // libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    implementation "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    runtimeOnly "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    runtimeOnly "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    runtimeOnly "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    runtimeOnly "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')

Review comment:
       Is this actually necessary?

##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,122 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.

Review comment:
       Which project folder and files? Which project should be imported?
   Are we missing a reference to the gradle quickstart?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/[email protected])
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics.
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+       options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }

Review comment:
       We shouldn't point users to snapshot dependencies.
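   For a released Flink version the repositories block could then be reduced to Maven Central only, e.g.:

```gradle
repositories {
    // released Flink artifacts are published to Maven Central
    mavenCentral()
}
```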

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/[email protected])
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics.
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+       options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the // libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    implementation "org.apache.flink:flink-clients:${flinkVersion}"

Review comment:
       @matriv This is different than what you propose in https://github.com/apache/flink-web/pull/504/files.

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/)
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here:`target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your profile directory and add the dependency in between
+the `dependencies` tab.  
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>

Review comment:
       This should be unnecessary because all Flink dependencies have a provided dependency on slf4j and log4j.
   Also, the log4j groupId is outdated (`org.apache.logging.log4j` would be the correct choice).
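   If logging excludes are kept at all, a sketch of the corrected `artifactSet` (using the current log4j2 groupId) could look like:

```xml
<artifactSet>
    <excludes>
        <exclude>com.google.code.findbugs:jsr305</exclude>
        <!-- only needed if logging artifacts still leak in transitively -->
        <exclude>org.slf4j:*</exclude>
        <exclude>org.apache.logging.log4j:*</exclude>
    </excludes>
</artifactSet>
```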

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/)
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here:`target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your profile directory and add the dependency in between
+the `dependencies` tab.  
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml

Review comment:
       Overall I find it weird that we don't just link to the quickstarts. They are ultimately the source of truth of how a recommended project looks like, and are also tested.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/[email protected])

Review comment:
       This is oddly specific, technically incorrect (you need way more than flink-runtime), it shouldn't contain a hard-coded version, and the link is quite weird (why not maven central?)

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/[email protected])
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics.
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'

Review comment:
       As is this property is unused, and none of the instructions on this page reference it.
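   If it were kept, it would only be meaningful together with a Scala-suffixed artifact; purely as an illustration (not part of the quickstart as shown), that would look roughly like:

```gradle
dependencies {
    // only relevant when using the Scala APIs; the artifact name carries the Scala suffix
    implementation "org.apache.flink:flink-streaming-scala_${scalaBinaryVersion}:${flinkVersion}"
}
```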

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,148 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (i.e. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.

Review comment:
       rt.jar doesn't exist in JDK 9+. I would just remove the references to any jars.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/[email protected])
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics.
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'

Review comment:
       `myorg.org` points to an actively used domain. I would suggest to just remove it.
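   If a placeholder group is still wanted, a domain reserved for documentation avoids the problem, e.g.:

```gradle
// 'example.org' is reserved for documentation, so it cannot collide with a real owner
group = 'org.example.quickstart'
```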

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/)
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here:`target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your profile directory and add the dependency in between
+the `dependencies` tab.  
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>
+                            </excludes>
+                        </artifactSet>
+                        <filters>
+                            <filter>
+                                <!-- Do not copy the signatures in the META-INF folder.
+                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
+                                <artifact>*:*</artifact>
+                                <excludes>
+                                    <exclude>META-INF/*.SF</exclude>
+                                    <exclude>META-INF/*.DSA</exclude>
+                                    <exclude>META-INF/*.RSA</exclude>
+                                </excludes>
+                            </filter>
+                        </filters>
+                        <transformers>
+                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">

Review comment:
       This is missing the `ServicesResourceTransformer`.
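
For reference, a minimal sketch of the entry the reviewer is pointing at, as it could look inside the `<transformers>` section of the shade plugin configuration (the `ServicesResourceTransformer` merges `META-INF/services` files from all shaded dependencies, which Flink relies on for service/factory discovery):

```xml
<transformers>
    <!-- Merges META-INF/services entries from all dependencies instead of overwriting them -->
    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
```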

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to
+manage the entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/)
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink, so you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
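+
+As a rough sketch (assuming a quickstart-style `pom.xml` where the Maven Shade Plugin writes the JAR manifest; the class name below is just a placeholder), the entry point is declared via the `ManifestResourceTransformer`:
+
+```xml
+<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+    <!-- Placeholder: replace with your application's main class / entry point -->
+    <mainClass>org.example.DataStreamJob</mainClass>
+</transformer>
+```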
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add new dependencies inside
+the `dependencies` block.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application JAR with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best-case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst-case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+In contrast, the dependencies that should be packaged into the application JAR (such as connectors and
+libraries) must be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+You can submit the generated uber/fat JAR to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>

Review comment:
       this is different than the version in our quickstarts.

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to
+manage the entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.

Review comment:
       same issue as with the gradle guide.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
