sqoop job --exec sojob fails with ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf

Running sqoop job --exec sojob produces the following error:

19/07/10 21:15:17 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
19/07/10 21:15:17 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:243)
        at org.apache.sqoop.tool.JobTool.run(JobTool.java:298)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 14 more

Cause: this is effectively an Exception in thread "main" java.lang.NoClassDefFoundError. Sqoop cannot find the Hive classes because the Hive environment configuration is wrong.
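
To confirm this diagnosis: the missing class org.apache.hadoop.hive.conf.HiveConf ships in Hive's hive-common jar, so you can check that your Hive install actually provides it (the path below assumes Hive 1.2.1 under /usr/hive, matching the configuration that follows; adjust for your layout):

[root@hadoop01 ~]# jar tf /usr/hive/lib/hive-common-1.2.1.jar | grep HiveConf

If the class is listed, the jar itself is fine; the problem is that Sqoop's environment does not point at it, which the configuration below addresses.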

Hive configuration:
Edit hive-env.sh and set the HADOOP_HOME path:
Remove the '#' in front of the export HADOOP_HOME line and point it at the directory where Hadoop is installed (the directory that contains Hadoop's conf, lib, bin, etc. folders; in my case, HADOOP_HOME=/usr/local/hadoop).
Specifying HADOOP_HOME when installing Hive works on much the same principle as specifying JAVA_HOME when installing Hadoop: Hadoop needs Java underneath it, and Hive needs Hadoop underneath it.
Likewise uncomment and set export HIVE_CONF_DIR=/usr/hive/conf,
and set export HIVE_AUX_JARS_PATH=/usr/hive/lib.
Then press Esc and type :wq to save and exit vi.
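
After these edits, the relevant lines in hive-env.sh should look roughly like this (the paths are the ones used above; substitute your own install locations):

export HADOOP_HOME=/usr/local/hadoop
export HIVE_CONF_DIR=/usr/hive/conf
export HIVE_AUX_JARS_PATH=/usr/hive/lib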

ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

Cause: the JDBC driver is missing. Copy your JDBC driver jar into the sqoop/lib directory:

[root@hadoop01 data]# cp mysql-connector-java-5.1.31.jar /opt/app/sqoop/lib/
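
To confirm that Sqoop now loads the driver and can reach MySQL, a quick test is sqoop list-databases (the hostname, port, and credentials below are placeholders for illustration; use your own):

[root@hadoop01 data]# sqoop list-databases --connect jdbc:mysql://hadoop01:3306 --username root -P

If this prints your database list, the driver is on Sqoop's classpath and the connection works.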

If it still fails: copy the two jars below from your hive/lib directory into sqoop/lib:

[root@hadoop1 lib]# cp /apps/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar .
[root@hadoop1 lib]# cp /apps/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar .
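
After copying, confirm the jars landed in Sqoop's lib directory (the prompt assumes you are still in that directory, as in the cp commands above) and re-run the job:

[root@hadoop1 lib]# ls | grep hive
[root@hadoop1 lib]# sqoop job --exec sojob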
