Operating HDFS with the Hadoop Java API -- 1

The Hadoop filesystem is an abstraction; HDFS is only one concrete implementation of it.

As far as HDFS goes, there are two ways to access the filesystem: (1) the command-line interface that ships with HDFS, which works much like shell commands on Linux; (2) the HDFS Java API, i.e. writing a Java program against it. A command-line example is shown right below.
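For example, printing a file with the command-line interface looks roughly like this (the path here is only a made-up example, substitute a file that actually exists in your HDFS):

stu@master:~$ hadoop fs -cat /user/stu/test.txt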

Environment: hadoop-1.0.4, Java 1.7.0_65, Ubuntu 14.04.1 LTS

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
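Since my environment already has Java 1.7, the same logic can also be written with try-with-resources so the stream gets closed automatically. This is just my own sketch (the class name FileSystemCat2 is made up), not from the book:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat2 {
    public static void main(String[] args) throws Exception {
        String uri = args[0];                      // HDFS URI passed on the command line
        Configuration conf = new Configuration();  // picks up core-site.xml etc. from the classpath
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        // fs.open() returns an FSDataInputStream, which is an InputStream,
        // so try-with-resources closes it for us and no finally block is needed
        try (FSDataInputStream in = fs.open(new Path(uri))) {
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
    }
}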

So frustrating, I still haven't gotten this program to run successfully.

At first I couldn't figure out which classes needed to be imported. To find out which packages the classes in the code come from, check the API docs: http://hadoop.apache.org/docs/current/api/index.html

Now it compiles with javac, but running it with hadoop plus the class name still fails, with this error:

hadoop FileSystemCat hdfs://conf.sh
Error: Could not find or load main class FileSystemCat

So frustrating!!!

-----------------------------------

I'm sure this is about how Java programs are run, and about the classpath... I need to figure it out! 21:28:54 2014-10-23

------------------------------

Problem solved: hadoop-env.sh has a CLASSPATH setting (HADOOP_CLASSPATH), and its value has to point to the place where the .class files produced by javac actually live. 2014-10-23 23:59:53

Today I found that HBase can actually be started directly even when Hadoop is not running. 2014-10-28 11:12:29

Compiling with javac FileSystemCat.java produces a lot of errors:

stu@master:~$ javac FileSystemCat.java
FileSystemCat.java:4: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
^
FileSystemCat.java:5: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FSDataInputStream;
^
FileSystemCat.java:6: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FileSystem;
^
FileSystemCat.java:7: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.Path;
^
FileSystemCat.java:8: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.IOUtils;
^
FileSystemCat.java:17: error: cannot find symbol
Configuration conf = new Configuration();
^
symbol: class Configuration
location: class FileSystemCat
FileSystemCat.java:17: error: cannot find symbol
Configuration conf = new Configuration();
^
symbol: class Configuration
location: class FileSystemCat
FileSystemCat.java:18: error: cannot find symbol
FileSystem fs = FileSystem.get(URI.create(uri), conf);
^
symbol: class FileSystem
location: class FileSystemCat
FileSystemCat.java:18: error: cannot find symbol
FileSystem fs = FileSystem.get(URI.create(uri), conf);
^
symbol: variable FileSystem
location: class FileSystemCat
FileSystemCat.java:21: error: cannot find symbol
in = fs.open(new Path(uri));
^
symbol: class Path
location: class FileSystemCat
FileSystemCat.java:22: error: cannot find symbol
IOUtils.copyBytes(in, System.out, 4096, false);
^
symbol: variable IOUtils
location: class FileSystemCat
FileSystemCat.java:24: error: cannot find symbol
IOUtils.closeStream(in);
^
symbol: variable IOUtils
location: class FileSystemCat
12 errors
stu@master:~$

The fix is to put the relevant Hadoop jar on the classpath when compiling, like this:

stu@master:~$ javac -classpath /home/stu/hadoop-1.0.4/hadoop-core-1.0.4.jar FileSystemCat.java

Then copy the generated FileSystemCat.class into the directory configured in hadoop-env.sh, and it works:

# Extra Java CLASSPATH elements. Optional.
export HADOOP_CLASSPATH=/home/stu/myclass
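Putting the whole workflow together, what finally worked for me looks roughly like this (the HDFS URI below is only an example, substitute the address of your own namenode and a file that exists):

stu@master:~$ javac -classpath /home/stu/hadoop-1.0.4/hadoop-core-1.0.4.jar FileSystemCat.java
stu@master:~$ cp FileSystemCat.class /home/stu/myclass/
stu@master:~$ hadoop FileSystemCat hdfs://master:9000/user/stu/test.txt

The hadoop script adds HADOOP_CLASSPATH (here /home/stu/myclass) to the Java classpath, which is why the class can be found this time.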
