Kengo Seki created BIGTOP-3695:
----------------------------------
Summary: Define Bigtop 3.2 release BOM
Key: BIGTOP-3695
URL: https://issues.apache.org/jira/browse/BIGTOP-3695
Project: Bigtop
Issue Type: Task
Reporter: Kengo Seki
Let's discuss the component stack of the next release.
I'd like to propose upgrading Hadoop to 3.3.x, since we have put it off for a
long time.
I'd also like to add Ranger to our stack, which is often required in projects
with high security requirements.
h3. Components
{noformat}
Component              v3.1         v3.2
alluxio                2.4.1     => 2.8.0 [1]
ambari                 2.7.5     => 2.7.6
bigtop-ambari-mpack    2.7.5        2.7.5
bigtop-groovy          2.5.4        2.5.4
bigtop-jsvc            1.2.4        1.2.4
bigtop-utils           3.1.0        3.1.0
elasticsearch          5.6.14       5.6.14
flink                  1.11.6    => 1.15.0
gpdb                   5.28.5       5.28.5
hadoop                 3.2.3     => 3.3.3
hbase                  2.4.11    => 2.4.12 [2]
hive                   3.1.2     => 3.1.3 [3]
kafka                  2.8.1        2.8.1
kibana                 5.4.1        5.4.1
livy                   0.7.1        0.7.1
logstash               5.4.1        5.4.1
oozie                  5.2.1        5.2.1
phoenix                5.1.2        5.1.2
ranger                 -         => 2.2.0 or greater [4]
solr                   8.11.1       8.11.1
spark                  3.1.2     => 3.2.1 [5]
sqoop                  1.4.7        1.4.7
tez                    0.10.1       0.10.1
ycsb                   0.17.0       0.17.0
zeppelin               0.10.0    => 0.10.1 [6]
zookeeper              3.5.9        3.5.9
{noformat}
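For reference, these bumps would end up in bigtop.bom. A minimal sketch of what the Hadoop entry might look like after the upgrade, assuming we keep the current entry layout and only change version.base (the other fields are copied from the existing entry and may differ in detail):
{noformat}
'hadoop' {
  name     = 'hadoop'
  relNotes = 'Apache Hadoop'
  // only the base version changes for this release; pkg/release stay derived as before
  version { base = '3.3.3'; pkg = base; release = 1 }
  tarball { destination = "${name}-${version.base}.tar.gz"
            source      = "${name}-${version.base}-src.tar.gz" }
  url     { download_path = "/$name/common/$name-${version.base}/"
            site    = "${apache.APACHE_MIRROR}/${download_path}"
            archive = "${apache.APACHE_ARCHIVE}/${download_path}" }
}
{noformat}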
h3. Distros
{noformat}
- CentOS 7, Rocky Linux 8
- Debian 10, 11
- Fedora 35
- Ubuntu 18.04, 20.04
{noformat}
h3. Archs
{noformat}
- x86_64
- aarch64
- ppc64le
{noformat}
h3. JDK
{noformat}
JDK8
{noformat}
[1]: According to the following document, Alluxio 2.8.0 is supposed to be
compatible with HDFS 3.3.x.
https://docs.alluxio.io/os/user/2.8.0/en/ufs/HDFS.html#supported-hdfs-versions
[2]: According to the following document, HBase 2.4.12 is supposed to be
compatible with Hadoop 3.3.x.
https://hbase.apache.org/book.html#hadoop
[3]: According to the following document, Hive 3.1.3 is supposed to be
compatible with Hadoop 3.x.y.
https://hive.apache.org/downloads.html
[4]: Ranger 2.2.0, which is the latest release as of now, doesn't support
Hadoop 3.3.x yet.
So we probably have to apply RANGER-3047 with some modification or wait
for the 2.3.0 release.
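If we adopt Ranger, it would also need a new bigtop.bom entry and packaging scripts under bigtop-packages/src/common/ranger/, where a backported RANGER-3047 could be carried as a patch*.diff file, following the convention used for other components. A rough sketch of the entry; the tarball names and mirror path below are assumptions to be checked against the actual Ranger release layout:
{noformat}
'ranger' {
  name     = 'ranger'
  relNotes = 'Apache Ranger'
  version { base = '2.2.0'; pkg = base; release = 1 }
  tarball { destination = "$name-${version.base}.tar.gz"
            source      = "apache-$name-${version.base}.tar.gz" }  // source tarball name is an assumption
  url     { download_path = "/$name/${version.base}/"              // mirror path is an assumption
            site    = "${apache.APACHE_MIRROR}/${download_path}"
            archive = "${apache.APACHE_ARCHIVE}/${download_path}" }
}
{noformat}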
[5]: We can expect Spark 3.2.1 to be compatible with Hadoop 3.3.x, since it is
built against Hadoop 3.3.0 in the following example:
https://spark.apache.org/docs/3.2.1/building-spark.html#specifying-the-hadoop-version-and-enabling-yarn
[6]: Zeppelin 0.10.1, which is the latest release as of now, already has the
"spark-3.2" build profile
(https://github.com/apache/zeppelin/blob/v0.10.1/spark/pom.xml#L172), but
doesn't have a Flink shim for 1.15.x yet
(https://github.com/apache/zeppelin/tree/v0.10.1/flink).
We probably have to apply ZEPPELIN-5600.