Importing a MySQL Table into HDFS with Sqoop

Running sqoop help lists the commands Sqoop provides:

sqoop help

  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information
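
Before running the import, you can verify the connection details by listing the tables in the database (a quick check; this uses the same connection parameters as the import command below):

sqoop list-tables --connect jdbc:mysql://cdh3:3306/gmall --username root --password 123456
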
sqoop import --connect jdbc:mysql://cdh3:3306/gmall --username root --password 123456 --table user_info \
--columns id,login_name --where "id >= 10 and id <= 30" --target-dir /test --delete-target-dir

1 --connect jdbc:mysql://cdh3:3306/gmall  JDBC connection string for the database
2 --username root --password 123456  database username and password
3 --table user_info  table to import
4 --columns id,login_name  which columns of the table to import
5 --where "id >= 10 and id <= 30"  filter condition; only rows with id between 10 and 30 (inclusive) are imported
6 --target-dir /test  target HDFS directory
7 --delete-target-dir  delete the target directory first if it already exists
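
Once the import finishes, Sqoop writes the result as part-m-* files under the target directory (one file per map task). A simple way to check the imported data, assuming the /test directory used above:

hdfs dfs -ls /test
hdfs dfs -cat /test/part-m-*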
