>> Original post by Tijun: http://www.cnblogs.com/tijun/ <<
Right after installing Hive, start it for the first time:
[hadoop@ltt1 bin]$ ./hive
ls: cannot access /home/hadoop/spark-2.2.0-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory
which: no hbase in (/home/hadoop/hive110/bin:/home/hadoop/spark-2.2.0-bin-hadoop2.6/bin:/home/hadoop/scala-2.11./bin:/home/hadoop/protobuf250/bin:/home/hadoop/hadoop260/bin:/home/hadoop/zookeeper345/bin:/home/hadoop/maven339/bin:/home/hadoop/jdk1.8.0_144/bin:/home/hadoop/spark-2.2.0-bin-hadoop2.6/bin:/home/hadoop/scala-2.11./bin:/home/hadoop/protobuf250/bin:/home/hadoop/hadoop260/bin:/home/hadoop/zookeeper345/bin:/home/hadoop/maven339/bin:/home/hadoop/jdk1.8.0_144/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hadoop/bin)
Logging initialized using configuration in file:/home/hadoop/hive110/conf/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive (default)>
The following message appears:
ls: cannot access /home/hadoop/spark-2.2.0-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory
The cause of this problem:
Starting with Spark 2.x, the single large assembly JAR that used to sit under the lib directory was split into many smaller JARs, so spark-assembly-*.jar no longer exists and Hive's startup script cannot find it.
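To see the layout change concretely, here is a small sketch using hypothetical temporary directories (not real Spark installs) that reproduces why the glob in Hive's startup script stops matching:

```shell
#!/bin/sh
# Simulate the directory layout of Spark 1.x vs Spark 2.x with temp dirs
# (file names below are illustrative, not real installs).
tmp=$(mktemp -d)
mkdir -p "$tmp/spark1/lib" "$tmp/spark2/jars"
touch "$tmp/spark1/lib/spark-assembly-1.6.0-hadoop2.6.0.jar"
touch "$tmp/spark2/jars/spark-core_2.11-2.2.0.jar"

# Spark 1.x: the glob matches the single assembly JAR
ls "$tmp/spark1/lib"/spark-assembly-*.jar

# Spark 2.x: lib/spark-assembly-*.jar no longer exists, so ls fails
# with "No such file or directory" -- exactly the message Hive prints
ls "$tmp/spark2/lib"/spark-assembly-*.jar 2>&1 || true

# Spark 2.x: the JARs now live under the jars directory instead
ls "$tmp/spark2/jars"/*.jar

rm -rf "$tmp"
```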
Solution
Open the bin directory under the Hive installation and edit the hive script:
cd $HIVE_HOME/bin
vi hive
Locate the block that builds the Spark classpath from lib/spark-assembly-*.jar (shown in the original post's screenshots) and change the glob so it points at the jars directory instead.
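In Hive 1.x the bin/hive script contains a line along these lines (the exact surrounding code may differ slightly between Hive releases); the fix is to change the old assembly-JAR glob to the Spark 2.x jars directory:

```shell
# Before: bin/hive expects the single Spark 1.x assembly JAR
sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`

# After: Spark 2.x ships many small JARs under $SPARK_HOME/jars instead
sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`
```

Save the file and run ./hive again; the "cannot access ... spark-assembly-*.jar" message should no longer appear. (This is a config fragment edited inside bin/hive, not a standalone script.)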
That resolves the problem.