Have you seen this thread?

http://search-hadoop.com/m/q3RTtCoKmv14Hd1H1&subj=Re+Spark+Hive+max+key+length+is+767+bytes
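
That thread covers the same "max key length is 767 bytes" error. The short explanation: when the metastore database uses utf8, the VARCHAR(256) PARAM_KEY column can take up to 256 * 3 = 768 bytes in the primary key index, one byte over InnoDB's 767-byte limit, so schema creation fails. A commonly suggested workaround, sketched below with "metastore" as a placeholder database name (use whatever database your javax.jdo.option.ConnectionURL in hive-site.xml points at), is to switch the metastore database to latin1 before Hive creates its tables:

    -- Run in the MySQL client against the metastore database.
    -- "metastore" is a placeholder name, not taken from the original post.
    ALTER DATABASE metastore CHARACTER SET latin1 COLLATE latin1_bin;
    -- If some tables were already created as utf8 by an earlier failed run,
    -- drop them (or the whole database) so Hive can recreate them, or
    -- convert them in place, for example:
    -- ALTER TABLE PARTITION_PARAMS CONVERT TO CHARACTER SET latin1;

Note that ALTER DATABASE only changes the default for tables created afterwards, so changing the character set after the tables already exist (as you may have done when trying latin1) would not by itself resolve the error.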

On Thu, Nov 26, 2015 at 5:26 AM, <luohui20...@sina.com> wrote:

> hi guys,
>
>      When I try to connect Hive with spark-sql, I get the problem below:
>
>
> [root@master spark]# bin/spark-shell --master local[4]
>
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
>
> log4j:WARN Please initialize the log4j system properly.
>
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
>
> Using Spark's repl log4j profile:
> org/apache/spark/log4j-defaults-repl.properties
>
> To adjust logging level use sc.setLogLevel("INFO")
>
> Welcome to
>
>       ____              __
>
>      / __/__  ___ _____/ /__
>
>     _\ \/ _ \/ _ `/ __/  '_/
>
>    /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
>
>       /_/
>
>
> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java
> 1.8.0_65)
>
> Type in expressions to have them evaluated.
>
> Type :help for more information.
>
> 15/11/26 21:14:35 WARN Utils: Your hostname, master resolves to a loopback
> address: 127.0.1.1; using 10.60.162.236 instead (on interface eth1)
>
> 15/11/26 21:14:35 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
> another address
>
> 15/11/26 21:14:35 WARN SparkConf:
>
> SPARK_CLASSPATH was detected (set to
> ':/usr/lib/spark/lib/mysql-connector-java-5.1.21-bin.jar').
>
> This is deprecated in Spark 1.0+.
>
>
> Please instead use:
>
>  - ./spark-submit with --driver-class-path to augment the driver classpath
>
>  - spark.executor.extraClassPath to augment the executor classpath
>
>
> 15/11/26 21:14:35 WARN SparkConf: Setting 'spark.executor.extraClassPath'
> to ':/usr/lib/spark/lib/mysql-connector-java-5.1.21-bin.jar' as a
> work-around.
>
> 15/11/26 21:14:35 WARN SparkConf: Setting 'spark.driver.extraClassPath' to
> ':/usr/lib/spark/lib/mysql-connector-java-5.1.21-bin.jar' as a work-around.
>
> 15/11/26 21:14:36 WARN MetricsSystem: Using default name DAGScheduler for
> source because spark.app.id is not set.
>
> Spark context available as sc.
>
> 15/11/26 21:14:38 WARN Connection: BoneCP specified but not present in
> CLASSPATH (or one of dependencies)
>
> 15/11/26 21:14:39 WARN Connection: BoneCP specified but not present in
> CLASSPATH (or one of dependencies)
>
> 15/11/26 21:14:44 WARN ObjectStore: Version information not found in
> metastore. hive.metastore.schema.verification is not enabled so recording
> the schema version 1.2.0
>
> 15/11/26 21:14:44 WARN ObjectStore: Failed to get database default,
> returning NoSuchObjectException
>
> 15/11/26 21:14:46 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> 15/11/26 21:14:46 WARN Connection: BoneCP specified but not present in
> CLASSPATH (or one of dependencies)
>
> 15/11/26 21:14:46 WARN Connection: BoneCP specified but not present in
> CLASSPATH (or one of dependencies)
>
> 15/11/26 21:14:48 ERROR Datastore: Error thrown executing CREATE TABLE
> `PARTITION_PARAMS`
>
> (
>
>     `PART_ID` BIGINT NOT NULL,
>
>     `PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
>
>     `PARAM_VALUE` VARCHAR(4000) BINARY NULL,
>
>     CONSTRAINT `PARTITION_PARAMS_PK` PRIMARY KEY (`PART_ID`,`PARAM_KEY`)
>
> ) ENGINE=INNODB : Specified key was too long; max key length is 767 bytes
>
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key
> was too long; max key length is 767 bytes
>
>
>
> You can find the full log file <log> in the attached compressed file, and the
> conf/hive-site.xml is also attached.
>
>
> I tried changing the MySQL character set to both latin1 and utf8, but neither
> works. This exception can even occur when loading data into my table.
>
>
> Any ideas would be appreciated.
>
>
>
> --------------------------------
>
> Thanks & best regards!
> San.Luo
>
>
