lirui-apache commented on a change in pull request #10681:
[FLINK-14849][hive][doc] Fix documentation about Hive dependencies
URL: https://github.com/apache/flink/pull/10681#discussion_r361352760
##########
File path: docs/dev/table/hive/index.md
##########
@@ -100,147 +102,203 @@ We are using Hive 2.3.4 and 1.2.1 as examples here.
/flink-{{ site.version }}
/lib
- flink-dist{{ site.scala_version_suffix }}-{{ site.version }}.jar
- flink-table{{ site.scala_version_suffix }}-{{ site.version }}.jar
- // we highly recommend using Flink's blink planner with Hive integration
- flink-table-blink{{ site.scala_version_suffix }}-{{ site.version }}.jar
// Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
// Hadoop dependencies
- // Pick the correct Hadoop dependency for your project.
- // Hive 2.3.4 is built with Hadoop 2.7.2. We pick 2.7.5, which flink-shaded-hadoop is pre-built with,
- // but users can pick their own hadoop version, as long as it's compatible with Hadoop 2.7.2
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
flink-shaded-hadoop-2-uber-2.7.5-{{ site.shaded_version }}.jar
// Hive dependencies
hive-exec-2.3.4.jar
- ...
+{% endhighlight %}
+</div>
+
+<div data-lang="Hive 1.0.0" markdown="1">
+{% highlight txt %}
+/flink-{{ site.version }}
+ /lib
+
+ // Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
+ flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
+
+ // Hadoop dependencies
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
+ flink-shaded-hadoop-2-uber-2.6.5-{{ site.shaded_version }}.jar
+
+ // Hive dependencies
+ hive-metastore-1.0.0.jar
+ hive-exec-1.0.0.jar
+ libfb303-0.9.0.jar
+
+{% endhighlight %}
+</div>
+
+<div data-lang="Hive 1.1.0" markdown="1">
+{% highlight txt %}
+/flink-{{ site.version }}
+ /lib
+
+ // Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
+ flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
+
+ // Hadoop dependencies
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
+ flink-shaded-hadoop-2-uber-2.6.5-{{ site.shaded_version }}.jar
+
+ // Hive dependencies
+ hive-metastore-1.1.0.jar
+ hive-exec-1.1.0.jar
+ libfb303-0.9.2.jar
+
{% endhighlight %}
</div>
<div data-lang="Hive 1.2.1" markdown="1">
{% highlight txt %}
/flink-{{ site.version }}
/lib
- flink-dist{{ site.scala_version_suffix }}-{{ site.version }}.jar
- flink-table{{ site.scala_version_suffix }}-{{ site.version }}.jar
- // we highly recommend using Flink's blink planner with Hive integration
- flink-table-blink{{ site.scala_version_suffix }}-{{ site.version }}.jar
// Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
// Hadoop dependencies
- // Pick the correct Hadoop dependency for your project.
- // Hive 1.2.1 is built with Hadoop 2.6.0. We pick 2.6.5, which flink-shaded-hadoop is pre-built with,
- // but users can pick their own hadoop version, as long as it's compatible with Hadoop 2.6.0
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
flink-shaded-hadoop-2-uber-2.6.5-{{ site.shaded_version }}.jar
// Hive dependencies
hive-metastore-1.2.1.jar
hive-exec-1.2.1.jar
- libfb303-0.9.3.jar
+ libfb303-0.9.2.jar
- ...
{% endhighlight %}
</div>
+
+<div data-lang="Hive 2.0.0" markdown="1">
+{% highlight txt %}
+/flink-{{ site.version }}
+ /lib
+
+ // Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
+ flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
+
+ // Hadoop dependencies
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
+ flink-shaded-hadoop-2-uber-2.7.5-{{ site.shaded_version }}.jar
+
+ // Hive dependencies
+ hive-exec-2.0.0.jar
+
+{% endhighlight %}
</div>
+<div data-lang="Hive 2.1.0" markdown="1">
+{% highlight txt %}
+/flink-{{ site.version }}
+ /lib
-Similarly, If you are building your own program, you need the above dependencies in your mvn file.
+ // Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
+ flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
-<div class="codetabs" markdown="1">
-<div data-lang="Hive 2.3.4" markdown="1">
-{% highlight xml %}
-<dependency>
- <groupId>org.apache.flink</groupId>
- <artifactId>flink-connector-hive{{ site.scala_version_suffix }}</artifactId>
- <version>{{site.version}}</version>
- <scope>provided</scope>
-</dependency>
+ // Hadoop dependencies
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
+ flink-shaded-hadoop-2-uber-2.7.5-{{ site.shaded_version }}.jar
+
+ // Hive dependencies
+ hive-exec-2.1.0.jar
-<!-- Hadoop Dependencies -->
+{% endhighlight %}
+</div>
-<dependency>
- <groupId>org.apache.flink</groupId>
- <artifactId>flink-hadoop-compatibility{{ site.scala_version_suffix }}</artifactId>
- <version>{{site.version}}</version>
- <scope>provided</scope>
-</dependency>
+<div data-lang="Hive 2.2.0" markdown="1">
+{% highlight txt %}
+/flink-{{ site.version }}
+ /lib
-<!-- Pick the correct Hadoop dependency for your project.
-Hive 2.3.4 is built with Hadoop 2.7.2. We pick 2.7.5 which flink-shaded-hadoop is pre-built with,
- but users can pick their own hadoop version, as long as it's compatible with Hadoop 2.7.2 -->
+ // Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
+ flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
-<dependency>
- <groupId>org.apache.flink</groupId>
- <artifactId>flink-shaded-hadoop-2-uber</artifactId>
- <version>2.7.5-{{ site.shaded_version }}</version>
- <scope>provided</scope>
-</dependency>
+ // Hadoop dependencies
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
+ flink-shaded-hadoop-2-uber-2.7.5-{{ site.shaded_version }}.jar
+
+ // Hive dependencies
+ hive-exec-2.2.0.jar
+
+ // Orc dependencies -- required by the ORC vectorized optimizations
+ orc-core-1.4.3.jar
+ aircompressor-0.8.jar
-<!-- Hive Dependency -->
-<dependency>
- <groupId>org.apache.hive</groupId>
- <artifactId>hive-exec</artifactId>
- <version>2.3.4</version>
-</dependency>
{% endhighlight %}
</div>
-<div data-lang="Hive 1.2.1" markdown="1">
+<div data-lang="Hive 3.1.0" markdown="1">
+{% highlight txt %}
+/flink-{{ site.version }}
+ /lib
+
+ // Flink's Hive connector. Contains flink-hadoop-compatibility and flink-orc jars
+ flink-connector-hive{{ site.scala_version_suffix }}-{{ site.version }}.jar
+
+ // Hadoop dependencies
+ // You can pick a pre-built Hadoop uber jar provided by Flink; alternatively,
+ // you can use your own Hadoop jars. Either way, make sure they're compatible with your Hadoop
+ // cluster and the Hive version you're using.
+ flink-shaded-hadoop-2-uber-2.8.3-{{ site.shaded_version }}.jar
+
+ // Hive dependencies
+ hive-exec-3.1.0.jar
+ libfb303-0.9.3.jar
Review comment:
`libfb303` is either packed into `hive-exec` or introduced as a transitive dependency of `hive-exec`, so we don't need to declare it in our pom.
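To illustrate the point, here is a minimal pom sketch of what the review describes, reusing the `hive-exec` coordinates from the diff (the version shown is just the 2.3.4 example used in the docs): declaring `hive-exec` alone is sufficient, with no explicit `libfb303` entry.

{% highlight xml %}
<!-- hive-exec either shades libfb303 or pulls it in transitively,
     so no separate libfb303 dependency is declared here. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>2.3.4</version>
  <scope>provided</scope>
</dependency>
{% endhighlight %}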