1. Install Ant and Maven
(1) Download the Ant and Maven binary packages:
apache-ant-1.9.4-bin.zip
apache-maven-3.3.9-bin.zip
(2) Configure the environment variables.
Add the following system environment variables:
ANT_HOME=D:\software\apache-ant-1.9.4
MAVEN_HOME=D:\software\apache-maven-3.3.9
Then append ;%ANT_HOME%\bin;%MAVEN_HOME%\bin (without quotes) to the end of the existing Path value.
(3) Open a cmd window and verify the installation:
ant -version
mvn -version
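The variables in step (2) can also be set from the command line with setx instead of the System Properties dialog; a sketch, assuming the install paths shown above:

```bat
rem Set user-level environment variables (open a new cmd window afterwards,
rem because setx does not change the current session).
setx ANT_HOME "D:\software\apache-ant-1.9.4"
setx MAVEN_HOME "D:\software\apache-maven-3.3.9"
rem Literal paths are used below because ANT_HOME/MAVEN_HOME are not yet
rem visible in this session. Caution: setx truncates values longer than
rem 1024 characters, so editing Path in the dialog is often safer.
setx Path "%Path%;D:\software\apache-ant-1.9.4\bin;D:\software\apache-maven-3.3.9\bin"
```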
2. Install protoc
(1) Download protobuf-2.5.0.tar.gz and protoc-2.5.0-win32.zip, and extract each to its own directory.
(2) Copy protoc.exe from the extracted protoc-2.5.0-win32 directory into C:\Windows\System32.
(3) Also copy protoc.exe into the extracted D:\software\protobuf-2.5.0\src directory.
(4) In cmd, change to D:\software\protobuf-2.5.0\java and run mvn package.
This builds protobuf-java-2.5.0.jar (in the target directory).
(5) Verify:
protoc --version
libprotoc 2.5.0
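As an end-to-end sanity check of the toolchain, you can compile a minimal schema; the file name and message below are made up for illustration:

```proto
// person.proto -- a hypothetical minimal schema (protoc 2.5.0 predates proto3,
// so this is proto2 syntax).
package demo;

option java_package = "demo";
option java_outer_classname = "PersonProtos";

message Person {
  required string name = 1;
  optional int32 id = 2;
}
```

Running protoc --java_out=. person.proto should generate demo\PersonProtos.java, which compiles against the protobuf-java-2.5.0.jar built above.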
3. Download the Hadoop source
(1) Download hadoop-2.6.0-src.tar.gz and extract it to D:\software\.
(2) In cmd, change to D:\software\hadoop-2.6.0-src\hadoop-maven-plugins and run mvn install.
(3) In cmd, change to D:\software\hadoop-2.6.0-src and run mvn eclipse:eclipse -DskipTests.
4. Import the Hadoop source
Use Eclipse's Import wizard to import the generated projects into Eclipse.
5. Errors
Error #1: the hadoop-streaming project has a broken build path entry, shown as /root/workspace/hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf (missing).
Fix: remove the reference from the project's build path.
Error #2: hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDFSClientFailover.java reports an error on sun.net.spi.nameservice.NameService. This class needs to be imported; it exists in OpenJDK but could not be found in the Oracle JDK, so a copy has to be downloaded. NameService is an interface; find a copy of NameService.java online and place it in that package. http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/7u40-b43/sun/net/spi/nameservice/nameservice.java#nameservice
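For reference, the interface being added is tiny. Below is a standalone sketch of its shape, matching the OpenJDK 7 declaration; the real file must declare package sun.net.spi.nameservice, and the toy implementation here is only for illustration:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

// Sketch of the missing interface. The real file must begin with:
//   package sun.net.spi.nameservice;
// It is kept in the default package here so the snippet runs standalone.
interface NameService {
    // Resolve a host name to one or more addresses.
    InetAddress[] lookupAllHostAddr(String host) throws UnknownHostException;
    // Reverse-resolve a raw address to a host name.
    String getHostByAddr(byte[] addr) throws UnknownHostException;
}

public class NameServiceDemo {
    // A toy implementation, just to show the shape of the contract.
    static class FixedNameService implements NameService {
        public InetAddress[] lookupAllHostAddr(String host) throws UnknownHostException {
            return new InetAddress[] {
                InetAddress.getByAddress(host, new byte[] {127, 0, 0, 1})
            };
        }
        public String getHostByAddr(byte[] addr) throws UnknownHostException {
            return "localhost";
        }
    }

    public static void main(String[] args) throws Exception {
        NameService ns = new FixedNameService();
        System.out.println(ns.lookupAllHostAddr("example.test")[0].getHostAddress()); // 127.0.0.1
        System.out.println(ns.getHostByAddr(new byte[4]));                           // localhost
    }
}
```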
Error #3: /hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java fails on
import com.sun.org.apache.xml.internal.serialize.OutputFormat;
import com.sun.org.apache.xml.internal.serialize.XMLSerializer;
This is caused by Eclipse's strict access rules for internal JDK APIs. Open Java -> Compiler -> Errors/Warnings and, under "Deprecated and restricted API", change the setting of "Forbidden reference (access rules)" from Error to Warning.
Error #4: /hadoop-common/src/test/java/org/apache/hadoop/io/serializer/avro/TestAvroSerialization.java reports that the AvroRecord class is missing. Find a copy of AvroRecord.java online and place it in the same package. http://grepcode.com/file/repo1.maven.org/maven2/org.apache.hadoop/hadoop-common/2.2.0/org/apache/hadoop/io/serializer/avro/avrorecord.java#avrorecord
Error #5: the org.apache.hadoop.ipc.protobuf package is empty. Copy the protobuf sources generated under /hadoop-common/target/generated-sources/java into /hadoop-common/src/test/java. The package is also missing the following three classes; find them on grepcode, download the corresponding hadoop-common 2.2.0 files, and import them:
org.apache.hadoop.ipc.protobuf.TestProtos.EchoRequestProto
org.apache.hadoop.ipc.protobuf.TestProtos.EchoResponseProto
org.apache.hadoop.ipc.protobuf.TestRpcServiceProtos.TestProtobufRpcProto
Error #6: /hadoop-auth/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java reports errors on server.start() and server.stop(). The cause has not been found yet; to be investigated.
References:
http://www.linuxidc.com/Linux/2015-05/117705.htm
http://www.makaidong.com/博客园排行/3097.shtml