An Example of Processing Data with Apache Hive

Following the previous article, which showed how to process data on HDFS with Pig, this article demonstrates querying and processing data with Apache Hive.

Introduction to Apache Hive

  • First of all, Hive is a piece of data warehouse software
  • It uses HiveQL to structure and query the stored data
  • Execution engines: MapReduce, Tez, Spark (switchable per session, see the sketch after this list)
  • Data storage: HDFS, HBase
  • Use cases: data mining and analytics, machine learning, ad-hoc queries, and so on
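As a small illustration of the engine options above, the execution engine is selected per session through the hive.execution.engine property. A minimal HiveQL sketch (assuming this CDH 5.8 environment, where mr is the default and Tez or Spark must already be installed and configured before they can be used):
-- Show which engine the current session will use (mr by default here)
SET hive.execution.engine;
-- Switch this session to Spark (assumes Hive on Spark is set up)
SET hive.execution.engine=spark;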

Hive Usage Example

  • We again use the passwd file as our working data
[cloudera@quickstart ~]$ hdfs dfs -put /etc/passwd /tmp/
[cloudera@quickstart ~]$ hdfs dfs -ls /tmp/
Found 5 items
drwxrwxrwt - mapred mapred 0 2016-12-29 01:05 /tmp/hadoop-yarn
drwx-wx-wx - hive supergroup 0 2016-08-27 10:19 /tmp/hive
drwxrwxrwt - mapred hadoop 0 2016-08-10 14:37 /tmp/logs
-rw-r--r-- 1 cloudera supergroup 2559 2017-02-22 05:34 /tmp/passwd
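Before defining a table over the file, it is worth confirming its layout, since the ROW FORMAT clause below assumes colon-delimited fields. A quick check (head merely limits the output):
[cloudera@quickstart ~]$ hdfs dfs -cat /tmp/passwd | head -n 3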
  • Connect to Hive with beeline
[cloudera@quickstart ~]$ beeline -u jdbc:hive2://
scan complete in 24ms
Connecting to jdbc:hive2://
Connected to: Apache Hive (version 1.1.0-cdh5.8.0)
Driver: Hive JDBC (version 1.1.0-cdh5.8.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.1.0-cdh5.8.0 by Apache Hive
0: jdbc:hive2://>
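The URL jdbc:hive2:// with no host runs Hive in embedded mode inside the beeline process. To reach a standalone HiveServer2 instead, the URL carries a host and port; the host, port, and user below are assumptions for a default quickstart setup (10000 is HiveServer2's standard port):
[cloudera@quickstart ~]$ beeline -u jdbc:hive2://localhost:10000 -n cloudera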
  • Create the table and load the data

0: jdbc:hive2://> CREATE TABLE userinfo ( uname STRING, pswd STRING, uid INT, gid INT, fullname STRING, hdir STRING, shell STRING ) ROW FORMAT DELIMITED FIELDS TERMINATED BY ':' STORED AS TEXTFILE;
0: jdbc:hive2://> LOAD DATA INPATH '/tmp/passwd' OVERWRITE INTO TABLE userinfo;
0: jdbc:hive2://> select uname,fullname,hdir from userinfo order by uname;
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1 Reduce: 1 Cumulative CPU: 27.83 sec HDFS Read: 8767 HDFS Write: 1454 SUCCESS
Total MapReduce CPU Time Spent: 27 seconds 830 msec
OK
+----------------+-------------------------------+-------------------------------+--+
| uname | fullname | hdir |
+----------------+-------------------------------+-------------------------------+--+
| abrt | | /etc/abrt |
| adm | adm | /var/adm |
| apache | Apache | /var/www |
| avahi-autoipd | Avahi IPv4LL Stack | /var/lib/avahi-autoipd |
| bin | bin | /bin |
| cloudera | | /home/cloudera |
| cloudera-scm | Cloudera Manager | /var/lib/cloudera-scm-server |
...
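One caveat: LOAD DATA INPATH moves, rather than copies, /tmp/passwd from its original HDFS location into the Hive warehouse directory. With the table loaded, ordinary aggregations work as expected; for example, a sketch that counts accounts per login shell against the same userinfo table:
0: jdbc:hive2://> SELECT shell, COUNT(*) AS num_users FROM userinfo GROUP BY shell ORDER BY num_users DESC;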

Summary

  • beeline provides interactive access to Hive, much as sqlplus does for an Oracle database
  • Other interactive tools include: Hive CLI, HCatalog, WebHCat
  • For the corresponding DDL and DML syntax, refer to the official wiki