[
https://issues.apache.org/jira/browse/KYLIN-3068?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16273071#comment-16273071
]
Vsevolod Ostapenko commented on KYLIN-3068:
-------------------------------------------
I don't mind trying to contribute back to the project, but I find the existing
instructions for setting up a dev environment somewhat outdated.
Since Kylin 2.2.x now pulls and packages Spark 2.1.1 as part of the
distribution tar, do I really need to have Spark 1.6.x configured to run the
tests?
Are there updated, not-yet-published dev environment setup instructions?
> HiveColumnCardinalityJob.java is using deprecated parameter name for HDFS
> block size
> ------------------------------------------------------------------------------------
>
> Key: KYLIN-3068
> URL: https://issues.apache.org/jira/browse/KYLIN-3068
> Project: Kylin
> Issue Type: Bug
> Components: Job Engine
> Affects Versions: v2.2.0
> Environment: HDP 2.5.3, Kylin 2.2.0
> Reporter: Vsevolod Ostapenko
> Assignee: Dong Li
> Priority: Minor
> Original Estimate: 24h
> Remaining Estimate: 24h
>
> While setting the MR job configuration, HiveColumnCardinalityJob.java uses a
> deprecated parameter name for the HDFS block size.
> Since at least Hadoop 2.5, the recommended parameter name is dfs.blocksize,
> while the existing code (2.2.0 and earlier) uses dfs.block.size (please refer
> to the list of Hadoop deprecated parameters -
> http://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
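A minimal sketch of the kind of change the issue asks for, assuming the job
configuration is set through org.apache.hadoop.conf.Configuration; the class
and variable names below are illustrative, not the actual code in
HiveColumnCardinalityJob.java:

    import org.apache.hadoop.conf.Configuration;

    public class BlockSizeConfigSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // Deprecated key (still accepted, but flagged in
            // DeprecatedProperties.html):
            // conf.setLong("dfs.block.size", 32 * 1024 * 1024);

            // Current, non-deprecated key for the HDFS block size:
            conf.setLong("dfs.blocksize", 32 * 1024 * 1024); // 32 MB, example value

            System.out.println("dfs.blocksize = " + conf.get("dfs.blocksize"));
        }
    }

Since Hadoop maps dfs.block.size to dfs.blocksize internally, the fix is a
rename rather than a behavior change; it mainly removes the deprecation
warning and keeps the code aligned with current Hadoop documentation.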
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)