[Original] Troubleshooting Notes (16): Spark writing to a Hive external table fails with ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat

Spark 2.1.1

When Spark writes data to a Hive external table whose underlying data lives in HBase, it fails with:

Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat
    at org.apache.spark.sql.hive.SparkHiveWriterContainer.outputFormat$lzycompute(hiveWriterContainers.scala:82)
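For context, a minimal way to trigger the error looks roughly like the following. This is a hedged sketch: the table name, schema, and HBase column mapping are illustrative, not from the original post.

  // Hypothetical reproduction; table name, schema and column mapping are made up.
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("hive-hbase-write-repro")
    .enableHiveSupport()
    .getOrCreate()

  // Assumes a Hive external table backed by HBase was created beforehand, e.g.:
  //   CREATE EXTERNAL TABLE hbase_t (key string, value string)
  //   STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  //   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:value")
  //   TBLPROPERTIES ("hbase.table.name" = "hbase_t");

  // In Spark 2.1.1 this INSERT fails with the ClassCastException shown above.
  spark.sql("INSERT INTO hbase_t SELECT 'k1', 'v1'")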

The cast happens in org.apache.spark.sql.hive.SparkHiveWriterContainer:

  @transient private lazy val outputFormat =
    conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
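The cast fails because HiveHBaseTableOutputFormat implements only the plain Hadoop OutputFormat interface, not Hive's HiveOutputFormat. A quick way to confirm this from a spark-shell with the Hive and HBase handler jars on the classpath (a diagnostic sketch, not part of the original post):

  // Diagnostic sketch: inspect the class hierarchy via reflection.
  import org.apache.hadoop.hive.ql.io.HiveOutputFormat
  import org.apache.hadoop.mapred.OutputFormat

  val cls = Class.forName("org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat")
  // It is a Hadoop OutputFormat...
  println(classOf[OutputFormat[_, _]].isAssignableFrom(cls))     // expected: true
  // ...but not a HiveOutputFormat, hence the ClassCastException.
  println(classOf[HiveOutputFormat[_, _]].isAssignableFrom(cls)) // expected: false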

The exception comes from the lazy val above. Inspecting the code shows that the variable is not actually used on this write path, so it can safely be set to null when the cast is not possible:

  @transient private lazy val outputFormat =
    // was: conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
    conf.value.getOutputFormat match {
      case format if format.isInstanceOf[HiveOutputFormat[AnyRef, Writable]] =>
        format.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
      case _ => null
    }
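One detail worth knowing about the guard above: due to JVM type erasure, isInstanceOf[HiveOutputFormat[AnyRef, Writable]] only checks the erased HiveOutputFormat class at runtime; the type parameters are not verified. That is exactly what is needed here, since the goal is just to separate HiveOutputFormat implementations from other OutputFormats such as HiveHBaseTableOutputFormat. A standalone sketch of the erasure behavior (the types here are made up for illustration):

  // Standalone illustration of type erasure; these classes are made up.
  trait Format[K, V]
  class StringFormat extends Format[AnyRef, String]
  class IntFormat extends Format[Int, Int]
  class Unrelated

  val checks: Seq[Any] = Seq(new StringFormat, new IntFormat, new Unrelated)
  // Only the raw Format class is checked at runtime (the compiler emits an
  // "unchecked" warning); the type parameters are erased.
  checks.foreach(x => println(x.isInstanceOf[Format[AnyRef, String]]))
  // prints: true, true, false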

That resolves the problem. The upstream discussion is tracked in SPARK-6628: https://issues.apache.org/jira/browse/SPARK-6628
