I'm no expert on Snappy and CDH. The doc below describes the compression settings in Kylin, where you can disable compression or switch to a codec other than Snappy.
http://kylin.apache.org/docs/install/advance_settings.html

On Tue, Jan 12, 2016 at 7:36 PM, 赵磊 <[email protected]> wrote:

> hi all,
> I'm a beginner with Kylin. When I test a cube, I run into a problem. The SQL is:
>
> SET mapreduce.job.split.metainfo.maxsize=-1;
> SET mapred.compress.map.output=true;
> SET mapred.map.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
> SET mapred.output.compress=true;
> SET mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
> SET mapred.output.compression.type=BLOCK;
> SET mapreduce.job.max.split.locations=2000;
> SET dfs.replication=2;
> SET hive.merge.mapfiles=true;
> SET hive.merge.mapredfiles=true;
> SET hive.merge.size.per.task=268435456;
> SET hive.support.concurrency=false;
> SET hive.exec.compress.output=true;
> SET hive.auto.convert.join.noconditionaltask = true;
> SET hive.auto.convert.join.noconditionaltask.size = 300000000;
> INSERT OVERWRITE TABLE
> kylin_intermediate_kylin_sales_cube_desc_19700101000000_20160101000000_569b67ec_0762_4bdb_b333_f8e0dc1f92b9
> SELECT
>   KYLIN_SALES.PART_DT
>   ,KYLIN_SALES.LEAF_CATEG_ID
>   ,KYLIN_SALES.LSTG_SITE_ID
>   ,KYLIN_CATEGORY_GROUPINGS.META_CATEG_NAME
>   ,KYLIN_CATEGORY_GROUPINGS.CATEG_LVL2_NAME
>   ,KYLIN_CATEGORY_GROUPINGS.CATEG_LVL3_NAME
>   ,KYLIN_SALES.LSTG_FORMAT_NAME
>   ,KYLIN_SALES.PRICE
>   ,KYLIN_SALES.SELLER_ID
> FROM DEFAULT.KYLIN_SALES as KYLIN_SALES
> INNER JOIN DEFAULT.KYLIN_CAL_DT as KYLIN_CAL_DT
>   ON KYLIN_SALES.PART_DT = KYLIN_CAL_DT.CAL_DT
> INNER JOIN DEFAULT.KYLIN_CATEGORY_GROUPINGS as KYLIN_CATEGORY_GROUPINGS
>   ON KYLIN_SALES.LEAF_CATEG_ID = KYLIN_CATEGORY_GROUPINGS.LEAF_CATEG_ID
>   AND KYLIN_SALES.LSTG_SITE_ID = KYLIN_CATEGORY_GROUPINGS.SITE_ID
> WHERE (KYLIN_SALES.PART_DT < '2016-01-01');
>
> I checked around on the web; the problem is Snappy:
>
> hadoop checknative -a | grep snappy
> snappy: false
>
> Our platform is CDH 5.6, and I have already set up the Snappy pieces:
>
> /usr/lib64:
> lrwxrwxrwx. 1 root root     18 Jan  8 14:01 libsnappy.so -> libsnappy.so.1.1.4
> lrwxrwxrwx. 1 root root     18 Jul 20 20:41 libsnappy.so.1 -> libsnappy.so.1.1.4
> -rwxr-xr-x. 1 root root  22392 Nov 23  2013 libsnappy.so.1.1.4
>
> /home/hadoop/hadoop-2.6.0-cdh5.4.5/lib:
> -rw-r--r--. 1 root root  11785 Jan 11 11:57 hadoop-snappy-0.0.1-SNAPSHOT.jar
>
> /home/hadoop/hadoop-2.6.0-cdh5.4.5/lib/native/Linux-amd64-64:
> lrwxrwxrwx. 1 hadoop hadoop    24 Jan 11 15:11 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 hadoop hadoop    24 Jan 11 15:11 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> -rwxr-xr-x. 1 hadoop hadoop 54437 Jan 11 15:11 libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 hadoop hadoop    18 Jan 11 15:12 libsnappy.so -> libsnappy.so.1.1.4
> lrwxrwxrwx. 1 hadoop hadoop    18 Jan 11 15:12 libsnappy.so.1 -> libsnappy.so.1.1.4
> -rwxr-xr-x. 1 hadoop hadoop 22392 Jan 11 15:12 libsnappy.so.1.1.4
>
> /home/hadoop/hadoop-2.6.0-cdh5.4.5/etc/hadoop/hadoop-env.sh:
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/
> export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
> export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
>
> How can I solve the problem? Thanks!
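For reference, the conf change pointed to in the advance-settings doc above amounts to editing the Hadoop-style job configs that ship with Kylin. A minimal sketch, assuming the stock conf/kylin_job_conf.xml layout (property names are the standard MR2 compression keys; verify the file names and defaults against your Kylin version):

```xml
<!-- conf/kylin_job_conf.xml (sketch): turn off MR compression instead of
     using SnappyCodec, so jobs no longer require native Snappy support. -->
<property>
  <name>mapreduce.map.output.compress</name>
  <value>false</value>
</property>
<property>
  <name>mapreduce.output.fileoutputformat.compress</name>
  <value>false</value>
</property>
```

Alternatively, keep compression on but point the `*.compress.codec` properties at a codec your cluster does have native support for (e.g. org.apache.hadoop.io.compress.DefaultCodec). Similar Snappy references may also appear in conf/kylin_hive_conf.xml and in the HBase compression setting in kylin.properties; check those files for your version as well.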
