Spark + Hive integration hits a "MySQL check failed" error

Problem:

With Spark integrated with Hive, starting spark-shell or spark-sql reports:

INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql:
Lexical error at line 1, column 5. Encountered: "@" (64), after : "".

Environment:

spark-1.4

hive-1.2.1

mysql-5.1

MySQL JDBC driver

Cause:

Looking at the constructor of MetaStoreDirectSql in the Hive source:

  public MetaStoreDirectSql(PersistenceManager pm) {
    this.pm = pm;
    Transaction tx = pm.currentTransaction();
    tx.begin();
    boolean isMySql = false;
    try {
      trySetAnsiQuotesForMysql();
      isMySql = true;
    } catch (SQLException sqlEx) {
      LOG.info("MySQL check failed, assuming we are not on mysql: " + sqlEx.getMessage());
      tx.rollback();
      tx = pm.currentTransaction();
      tx.begin();
    }
    // ...

The constructor calls MetaStoreDirectSql.trySetAnsiQuotesForMysql(), which sets the session's sql_mode:

SET @@session.sql_mode=ANSI_QUOTES
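For context on why Hive issues this statement at all: in MySQL's default sql_mode, double quotes delimit string literals, while Hive's generated direct-SQL queries quote identifiers with double quotes; ANSI_QUOTES switches MySQL to the ANSI interpretation. A minimal illustration (TBLS and TBL_ID are real metastore table/column names; the queries are illustrative only):

-- Default MySQL mode: double quotes produce a string literal
SELECT "TBLS";                  -- returns the literal string 'TBLS'

-- With ANSI_QUOTES, double quotes delimit identifiers instead,
-- matching the quoting style of Hive's direct-SQL queries
SET @@session.sql_mode = ANSI_QUOTES;
SELECT "TBL_ID" FROM "TBLS";    -- now reads column TBL_ID of table TBLS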

When the MySQL JDBC driver runs execute(sql), it first validates the statement, and that validation rejects it with the Encountered: "@" (64) error (64 is the ASCII code of '@'; column 5 is the first '@' in the statement above).
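Putting the pieces together, the constructor's probe-and-fallback flow can be simulated standalone. This is a sketch with hypothetical stand-ins (txBegin/txRollback replace the JDO Transaction calls, and the probe always throws, as it does when the SET statement is rejected); it shows that the catch block leaves isMySql=false and reopens a transaction, so the metastore continues on the non-MySQL path:

```java
import java.sql.SQLException;

public class MysqlProbeSketch {
    static int beginCount = 0;
    static int rollbackCount = 0;

    // Stand-ins for the JDO transaction calls (hypothetical; the real code
    // uses a javax.jdo.Transaction obtained from the PersistenceManager).
    static void txBegin()    { beginCount++; }
    static void txRollback() { rollbackCount++; }

    // Simulates trySetAnsiQuotesForMysql(): always fails here, as it does
    // when "SET @@session.sql_mode=ANSI_QUOTES" is rejected.
    static void trySetAnsiQuotesForMysql() throws SQLException {
        throw new SQLException(
            "Lexical error at line 1, column 5.  Encountered: \"@\" (64), after : \"\".");
    }

    // Mirrors the constructor's flow; returns the isMySql flag.
    static boolean probeMysql() {
        txBegin();
        boolean isMySql = false;
        try {
            trySetAnsiQuotesForMysql();
            isMySql = true;
        } catch (SQLException sqlEx) {
            // Same recovery path as MetaStoreDirectSql: log, roll back, reopen tx
            System.out.println(
                "MySQL check failed, assuming we are not on mysql: " + sqlEx.getMessage());
            txRollback();
            txBegin();
        }
        return isMySql;
    }

    public static void main(String[] args) {
        boolean isMySql = probeMysql();
        System.out.println("isMySql=" + isMySql
            + ", begins=" + beginCount + ", rollbacks=" + rollbackCount);
    }
}
```

Because the exception is caught, the constructor completes normally; the cost is only that MySQL-specific direct SQL is disabled for the session.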

Solution:

(No good fix has been found yet.)
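Not a fix for the parse failure itself, but worth noting: since the catch block above rolls back and continues, the message is only INFO-level and the metastore still works through the DataNucleus ORM path. If the log noise is unwanted, the direct-SQL fast path can be disabled entirely in hive-site.xml via the hive.metastore.try.direct.sql property (a real Hive config key; behavior sketch, assuming the probe is then skipped):

<property>
  <name>hive.metastore.try.direct.sql</name>
  <value>false</value>
</property>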
