How to fix "FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask" when running commands in the Hive shell (illustrated walkthrough)

  

  Let's get straight to the point.

    

  How you fix this depends on whether your Hive and HBase are both CDH releases, or one is the Apache release and the other is CDH.

  Problem details


[kfk@bigdata-pro01 apache-hive-1.0.-bin]$ bin/hive

Logging initialized using configuration in file:/opt/modules/apache-hive-1.0.-bin/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/apache-hive-1.0.-bin/lib/hive-jdbc-1.0.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive> create external table weblogs(id string,datatime string,userid string,searchname string,retorder string,cliorder string,cliurl string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES("hbase.columns.mapping" = ":key,info:datatime,info:userid,info:searchname,info:retorder,info:cliorder,info:cliurl") TBLPROPERTIES("hbase.table.name" = "weblogs");
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:HBase table weblogs doesn't exist while the table is declared as an external table.)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:)
at com.sun.proxy.$Proxy7.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hadoop.util.RunJar.run(RunJar.java:)
at org.apache.hadoop.util.RunJar.main(RunJar.java:)
)


    This shows that the table was not created successfully.
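  As a first sanity check (a sketch, not part of the original walkthrough), you can ask the HBase shell directly whether the table the error complains about actually exists:

echo "exists 'weblogs'" | hbase shell

  If it reports that weblogs does not exist, create it in HBase first (see the note at the end of this post); if it does exist, the failure is more likely a classpath or configuration problem, which is what the rest of this walkthrough deals with.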

 Solution when Hive and HBase are both CDH releases


  First, restart Hive with the following command to run it in DEBUG mode:
  ./hive -hiveconf hive.root.logger=DEBUG,console
  The debug output reveals the underlying cause of the error.


  The debug output shows that the HBase jars are not being loaded correctly, so their configuration cannot be loaded either.

  Copy the jars from HBase's lib directory into hadoop/lib (note: to be on the safe side, I did this on all three nodes). A scripted version of the same copy is sketched after the three transcripts below.


[kfk@bigdata-pro01 lib]$ cp  hbase-client-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-common-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-common-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-examples-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0.jar /opt/modules/h
hadoop-2.6./ hbase-0.98.-cdh5.3.0/ hive-0.13.-cdh5.3.0/
[kfk@bigdata-pro01 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-hadoop-compat-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-hadoop-compat-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-it-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-it-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-prefix-tree-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-protocol-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-server-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-server-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-shell-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-testing-util-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ cp hbase-thrift-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro01 lib]$ pwd
/opt/modules/hbase-0.98.-cdh5.3.0/lib
[kfk@bigdata-pro01 lib]$


[kfk@bigdata-pro02 lib]$ cp  hbase-client-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-common-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-common-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-examples-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-hadoop-compat-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-hadoop-compat-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-it-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-it-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-prefix-tree-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-protocol-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-server-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-server-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-shell-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$
[kfk@bigdata-pro02 lib]$ cp hbase-testing-util-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-thrift-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ cp hbase-testing-util-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro02 lib]$ pwd
/opt/modules/hbase-0.98.-cdh5.3.0/lib
[kfk@bigdata-pro02 lib]$


[kfk@bigdata-pro03 ~]$ cd /opt/modules/hbase-0.98.-cdh5.3.0/lib/
[kfk@bigdata-pro03 lib]$ cp hbase-client-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-common-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-common-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-examples-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-hadoop2-compat-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-hadoop-compat-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-hadoop-compat-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-it-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-it-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-prefix-tree-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-protocol-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-server-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-server-0.98.-cdh5.3.0-tests.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ cp hbase-shell-0.98.-cdh5.3.0.jar /opt/modules/hadoop-2.6./lib
[kfk@bigdata-pro03 lib]$ pwd
/opt/modules/hbase-0.98.-cdh5.3.0/lib
[kfk@bigdata-pro03 lib]$
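  The same copy can be scripted instead of typing each cp by hand. This is only a sketch: it assumes the /opt/modules layout shown above and passwordless SSH from bigdata-pro01 to the other two nodes, and the x placeholders stand for whatever exact versions are installed on your cluster.

# Adjust these two paths to the exact versions on your cluster.
HBASE_LIB=/opt/modules/hbase-0.98.x-cdh5.3.0/lib
HADOOP_LIB=/opt/modules/hadoop-2.6.x/lib

cd "$HBASE_LIB"
for jar in hbase-*.jar; do
  cp "$jar" "$HADOOP_LIB"/                       # this node
  for node in bigdata-pro02 bigdata-pro03; do    # the other two nodes
    scp "$jar" "${node}:${HADOOP_LIB}/"          # assumes passwordless ssh
  done
done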

  Then modify Hive's configuration so that the HBase lib jars take effect.
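  One common way to do this (a sketch, assuming the paths used above; the version numbers in the jar names are placeholders to adjust) is to point HIVE_AUX_JARS_PATH at the HBase jars in $HIVE_HOME/conf/hive-env.sh, so Hive puts them on its classpath at startup:

# $HIVE_HOME/conf/hive-env.sh -- adjust the version numbers to your install
export HBASE_HOME=/opt/modules/hbase-0.98.x-cdh5.3.0
export HIVE_AUX_JARS_PATH=$HBASE_HOME/lib/hbase-client-0.98.x-cdh5.3.0.jar,$HBASE_HOME/lib/hbase-common-0.98.x-cdh5.3.0.jar,$HBASE_HOME/lib/hbase-server-0.98.x-cdh5.3.0.jar,$HBASE_HOME/lib/hbase-protocol-0.98.x-cdh5.3.0.jar,$HBASE_HOME/lib/htrace-core-2.04.jar

  The equivalent can also be set as the hive.aux.jars.path property in hive-site.xml.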


  Restart Hive and create the table again:


[kfk@bigdata-pro01 hive-0.13.-cdh5.3.0]$ bin/hive

Logging initialized using configuration in file:/opt/modules/hive-0.13.-cdh5.3.0/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hbase-0.98.-cdh5.3.0/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive (default)> create external table weblogs(
> id string,
> datatime string,
> userid string,
> searchname string,
> retorder string,
> cliorder string,
> cliurl string
> )
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES("hbase.columns.mapping" = ":key,info:datatime,info:userid,info:searchname,info:retorder,info:cliorder,info:cliurl")
> TBLPROPERTIES("hbase.table.name" = "weblogs");
OK
Time taken: 2.084 seconds
hive (default)>

 Solution when Hive and HBase are not both CDH releases (e.g. one is Apache, one is CDH)

  (1) Exit the Hive shell and enter it again.

    (2) Try restarting the HBase service.

   (3) Don't just check Hive; check your HBase processes as well (a quick check is sketched after the related posts below).

    Work through time synchronization, the firewall, whether MySQL has been started, whether hive-site.xml is configured correctly, and whether the metastore privileges are granted properly. Related posts:

The "java.lang.RuntimeException: HRegionServer Aborted" problem

A node's HRegionServer disappearing after starting a distributed HBase cluster
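  For point (3), a quick way to confirm the HBase daemons are actually running on every node is to run jps on each of them (a sketch, assuming passwordless SSH between the nodes):

# Expect HMaster and/or HRegionServer (plus HQuorumPeer or QuorumPeerMain
# for ZooKeeper) in each node's output.
for node in bigdata-pro01 bigdata-pro02 bigdata-pro03; do
  echo "== $node =="
  ssh "$node" jps
done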

    (4) Missing jars: the Hive-HBase integration step itself went wrong.

  Official documentation: https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration


  

   Blinking entries in the directory listing mean a symlink is broken and its target resource cannot be found.


export HBASE_HOME=/opt/modules/hbase-0.98.-cdh5.3.0
export HIVE_HOME=/opt/modules/apache-hive-1.0.-bin
ln -s $HBASE_HOME/lib/hbase-server-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-server-0.98.-cdh5.3.0.jar
ln -s $HBASE_HOME/lib/hbase-client-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-client-0.98.-cdh5.3.0.jar
ln -s $HBASE_HOME/lib/hbase-protocol-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-protocol-0.98.-cdh5.3.0.jar
ln -s $HBASE_HOME/lib/hbase-it-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-it-0.98.-cdh5.3.0.jar
ln -s $HBASE_HOME/lib/htrace-core-2.04.jar $HIVE_HOME/lib/htrace-core-2.04.jar
ln -s $HBASE_HOME/lib/hbase-hadoop2-compat-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-hadoop2-compat-0.98.-cdh5.3.0.jar
ln -s $HBASE_HOME/lib/hbase-hadoop-compat-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-hadoop-compat-0.98.-cdh5.3.0.jar
ln -s $HBASE_HOME/lib/high-scale-lib-1.1..jar $HIVE_HOME/lib/high-scale-lib-1.1..jar
ln -s $HBASE_HOME/lib/hbase-common-0.98.-cdh5.3.0.jar $HIVE_HOME/lib/hbase-common-0.98.-cdh5.3.0.jar
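  To confirm none of these links are broken (the "blinking" entries mentioned above), a quick check is to list any symlink under Hive's lib directory whose target no longer resolves:

# With -L, find follows symlinks; anything still reported as type l is a
# link whose target is missing. No output means every link resolves.
find -L $HIVE_HOME/lib -type l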


[kfk@bigdata-pro01 lib]$ ll
total
-rw-rw-r-- kfk kfk Jan : accumulo-core-1.6..jar
-rw-rw-r-- kfk kfk Jan : accumulo-fate-1.6..jar
-rw-rw-r-- kfk kfk Jan : accumulo-start-1.6..jar
-rw-rw-r-- kfk kfk Jan : accumulo-trace-1.6..jar
-rw-rw-r-- kfk kfk Jan : activation-1.1.jar
-rw-rw-r-- kfk kfk Jan : ant-1.9..jar
-rw-rw-r-- kfk kfk Jan : ant-launcher-1.9..jar
-rw-rw-r-- kfk kfk Jan : antlr-2.7..jar
-rw-rw-r-- kfk kfk Jan : antlr-runtime-3.4.jar
-rw-rw-r-- kfk kfk Jan : asm-commons-3.1.jar
-rw-rw-r-- kfk kfk Jan : asm-tree-3.1.jar
-rw-rw-r-- kfk kfk Jan : avro-1.7..jar
-rw-rw-r-- kfk kfk Jan : bonecp-0.8..RELEASE.jar
-rw-rw-r-- kfk kfk Jan : calcite-avatica-0.9.-incubating.jar
-rw-rw-r-- kfk kfk Jan : calcite-core-0.9.-incubating.jar
-rw-rw-r-- kfk kfk Jan : commons-beanutils-1.7..jar
-rw-rw-r-- kfk kfk Jan : commons-beanutils-core-1.8..jar
-rw-rw-r-- kfk kfk Jan : commons-cli-1.2.jar
-rw-rw-r-- kfk kfk Jan : commons-codec-1.4.jar
-rw-rw-r-- kfk kfk Jan : commons-collections-3.2..jar
-rw-rw-r-- kfk kfk Jan : commons-compiler-2.7..jar
-rw-rw-r-- kfk kfk Jan : commons-compress-1.4..jar
-rw-rw-r-- kfk kfk Jan : commons-configuration-1.6.jar
-rw-rw-r-- kfk kfk Jan : commons-dbcp-1.4.jar
-rw-rw-r-- kfk kfk Jan : commons-digester-1.8.jar
-rw-rw-r-- kfk kfk Jan : commons-httpclient-3.0..jar
-rw-rw-r-- kfk kfk Jan : commons-io-2.4.jar
-rw-rw-r-- kfk kfk Jan : commons-lang-2.6.jar
-rw-rw-r-- kfk kfk Jan : commons-logging-1.1..jar
-rw-rw-r-- kfk kfk Jan : commons-math-2.1.jar
-rw-rw-r-- kfk kfk Jan : commons-pool-1.5..jar
-rw-rw-r-- kfk kfk Jan : commons-vfs2-2.0.jar
-rw-rw-r-- kfk kfk Jan : curator-client-2.6..jar
-rw-rw-r-- kfk kfk Jan : curator-framework-2.6..jar
-rw-rw-r-- kfk kfk Jan : datanucleus-api-jdo-3.2..jar
-rw-rw-r-- kfk kfk Jan : datanucleus-core-3.2..jar
-rw-rw-r-- kfk kfk Jan : datanucleus-rdbms-3.2..jar
-rw-rw-r-- kfk kfk Jan : derby-10.10.1.1.jar
-rw-rw-r-- kfk kfk Jan : eigenbase-properties-1.1..jar
-rw-rw-r-- kfk kfk Jan : geronimo-annotation_1.0_spec-1.1..jar
-rw-rw-r-- kfk kfk Jan : geronimo-jaspic_1.0_spec-1.0.jar
-rw-rw-r-- kfk kfk Jan : geronimo-jta_1.1_spec-1.1..jar
-rw-rw-r-- kfk kfk Jan : groovy-all-2.1..jar
-rw-rw-r-- kfk kfk Jan : guava-11.0..jar
-rw-rw-r-- kfk kfk Jan : hamcrest-core-1.1.jar
lrwxrwxrwx kfk kfk Jan : hbase-client-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-client-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : hbase-common-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-common-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : hbase-hadoop2-compat-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-hadoop2-compat-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : hbase-hadoop-compat-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-hadoop-compat-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : hbase-it-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-it-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : hbase-protocol-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-protocol-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : hbase-server-0.98.-cdh5.3.0.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/hbase-server-0.98.-cdh5.3.0.jar
lrwxrwxrwx kfk kfk Jan : high-scale-lib-1.1..jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/high-scale-lib-1.1..jar
-rw-rw-r-- kfk kfk Jan : hive-accumulo-handler-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-ant-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-beeline-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-cli-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-common-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-contrib-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-exec-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-hbase-handler-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-hwi-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-jdbc-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-jdbc-1.0.-standalone.jar
-rw-rw-r-- kfk kfk Jan : hive-metastore-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-serde-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-service-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-shims-0.20-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-shims-.20S-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-shims-0.23-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-shims-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-shims-common-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-shims-common-secure-1.0..jar
-rw-rw-r-- kfk kfk Jan : hive-testutils-1.0..jar
lrwxrwxrwx kfk kfk Jan : htrace-core-2.04.jar -> /opt/modules/hbase-0.98.-cdh5.3.0/lib/htrace-core-2.04.jar
-rw-rw-r-- kfk kfk Jan : httpclient-4.2..jar
-rw-rw-r-- kfk kfk Jan : httpcore-4.2..jar
-rw-rw-r-- kfk kfk Jan : janino-2.7..jar
-rw-rw-r-- kfk kfk Jan : jansi-1.11.jar
-rw-rw-r-- kfk kfk Jan : jcommander-1.32.jar
-rw-rw-r-- kfk kfk Jan : jdo-api-3.0..jar
-rw-rw-r-- kfk kfk Jan : jetty-all-7.6..v20120127.jar
-rw-rw-r-- kfk kfk Jan : jetty-all-server-7.6..v20120127.jar
-rw-rw-r-- kfk kfk Jan : jline-0.9..jar
-rw-rw-r-- kfk kfk Jan : jpam-1.1.jar
-rw-rw-r-- kfk kfk Jan : jsr305-1.3..jar
-rw-rw-r-- kfk kfk Jan : jta-1.1.jar
-rw-rw-r-- kfk kfk Jan : junit-4.11.jar
-rw-rw-r-- kfk kfk Jan : libfb303-0.9..jar
-rw-rw-r-- kfk kfk Jan : libthrift-0.9..jar
-rw-rw-r-- kfk kfk Jan : linq4j-0.4.jar
-rw-rw-r-- kfk kfk Jan : log4j-1.2..jar
-rw-rw-r-- kfk kfk Jan : mail-1.4..jar
-rw-rw-r-- kfk kfk Jan : maven-scm-api-1.4.jar
-rw-rw-r-- kfk kfk Jan : maven-scm-provider-svn-commons-1.4.jar
-rw-rw-r-- kfk kfk Jan : maven-scm-provider-svnexe-1.4.jar
-rw-rw-r-- kfk kfk Jan : mysql-connector-java-5.1..jar
-rw-rw-r-- kfk kfk Jan : opencsv-2.3.jar
-rw-rw-r-- kfk kfk Jan : oro-2.0..jar
-rw-rw-r-- kfk kfk Jan : paranamer-2.3.jar
-rw-rw-r-- kfk kfk Jan : pentaho-aggdesigner-algorithm-5.1.-jhyde.jar
drwxrwxr-x kfk kfk Jan : php
-rw-rw-r-- kfk kfk Jan : plexus-utils-1.5..jar
drwxrwxr-x kfk kfk Jan : py
-rw-rw-r-- kfk kfk Jan : quidem-0.1..jar
-rw-rw-r-- kfk kfk Jan : regexp-1.3.jar
-rw-rw-r-- kfk kfk Jan : servlet-api-2.5.jar
-rw-rw-r-- kfk kfk Jan : snappy-java-1.0..jar
-rw-rw-r-- kfk kfk Jan : ST4-4.0..jar
-rw-rw-r-- kfk kfk Jan : stax-api-1.0..jar
-rw-rw-r-- kfk kfk Jan : stringtemplate-3.2..jar
-rw-rw-r-- kfk kfk Jan : super-csv-2.2..jar
-rw-rw-r-- kfk kfk Jan : tempus-fugit-1.1.jar
-rw-rw-r-- kfk kfk Jan : velocity-1.5.jar
-rw-rw-r-- kfk kfk Jan : xz-1.0.jar
-rw-rw-r-- kfk kfk Jan : zookeeper-3.4..jar
[kfk@bigdata-pro01 lib]$
    (5) If you run into other Hive problems, the related troubleshooting posts may also help you.

  Then run the create statement again:


[kfk@bigdata-pro01 apache-hive-1.0.-bin]$ bin/hive

Logging initialized using configuration in file:/opt/modules/apache-hive-1.0.-bin/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/apache-hive-1.0.-bin/lib/hive-jdbc-1.0.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive> create external table weblogs(
> id string,
> datatime string,
> userid string,
> searchname string,
> retorder string,
> cliorder string,
> cliurl string
> )
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES("hbase.columns.mapping" = ":key,info:datatime,info:userid,info:searchname,info:retorder,info:cliorder,info:cliurl")
> TBLPROPERTIES("hbase.table.name" = "weblogs");

  Or, as a single statement:


[kfk@bigdata-pro01 apache-hive-1.0.-bin]$ bin/hive

Logging initialized using configuration in file:/opt/modules/apache-hive-1.0.-bin/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/apache-hive-1.0.-bin/lib/hive-jdbc-1.0.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive> create external table weblogs(id string,datatime string,userid string,searchname string,retorder string,cliorder string,cliurl string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES("hbase.columns.mapping" = ":key,info:datatime,info:userid,info:searchname,info:retorder,info:cliorder,info:cliurl") TBLPROPERTIES("hbase.table.name" = "weblogs");

    The statement above creates a Hive external table integrated with HBase.

     Creating this table only tells Hive to read the HBase table.
     The corresponding HBase table must already exist, with a structure matching what you declared.
     Also check whether the data collected by Flume has actually made it into HBase.
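  If the HBase table really does not exist yet, it has to be created first, with the column family used in hbase.columns.mapping (info); only then will the CREATE EXTERNAL TABLE above succeed. A minimal sketch in the HBase shell:

# Create the 'weblogs' table with the 'info' column family, then verify it.
echo "create 'weblogs', 'info'" | hbase shell
echo "exists 'weblogs'" | hbase shell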
 
 