1. Configure core-site.xml
Add the following properties:
<property>
  <name>hadoop.http.filter.initializers</name>
  <value>org.apache.hadoop.security.AuthenticationFilterInitializer</value>
</property>
<property>
  <name>hadoop.http.authentication.type</name>
  <value>simple</value>
</property>
<property>
  <name>hadoop.http.authentication.signature.secret.file</name>
  <value>/data1/hadoop/hadoop/etc/hadoop/hadoop-http-auth-signature-secret</value>
</property>
<property>
  <name>hadoop.http.authentication.simple.anonymous.allowed</name>
  <value>false</value>
</property>
<property>
  <name>hadoop.http.authentication.token.max-inactive-interval</name>
  <value>60</value>
</property>
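The signature secret file referenced above should exist (with restrictive permissions) before the daemons start. Here is a minimal sketch that writes a random secret; the production path is the one from the config above, but the demo writes to a temporary file so it can run anywhere:

```python
import os
import secrets
import tempfile

def write_signature_secret(path: str) -> None:
    """Write a random signing secret, readable only by its owner."""
    with open(path, "w") as f:
        f.write(secrets.token_hex(32))  # 64 hex characters of randomness
    os.chmod(path, 0o600)  # keep the secret private to the daemon user

# In a real deployment the path would be the one configured above:
# /data1/hadoop/hadoop/etc/hadoop/hadoop-http-auth-signature-secret
# Here we demonstrate on a temporary file instead.
demo_path = os.path.join(tempfile.mkdtemp(), "hadoop-http-auth-signature-secret")
write_signature_secret(demo_path)
print(len(open(demo_path).read()))  # 64
```

The file's contents only need to be a stable secret string shared by all nodes serving the web UIs.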
Parameter reference:
Parameter | Description | Default |
---|---|---|
hadoop.http.filter.initializers | Filter initializer that enables HTTP authentication; set it to org.apache.hadoop.security.AuthenticationFilterInitializer | org.apache.hadoop.http.lib.StaticUserWebFilter |
hadoop.http.authentication.type | Authentication type: simple or kerberos | simple |
hadoop.http.authentication.signature.secret.file | File holding the secret used to sign authentication tokens | $user.home/hadoop-http-auth-signature-secret |
hadoop.http.authentication.simple.anonymous.allowed | Whether anonymous requests are allowed | true |
hadoop.http.authentication.token.max-inactive-interval | Seconds of inactivity after which a token expires; -1 disables inactivity expiry | -1 |
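To sanity-check that the properties above actually made it into core-site.xml, a small parse of the file helps. This sketch runs against an inline snippet; in practice you would point `ET.parse` at your real core-site.xml:

```python
import xml.etree.ElementTree as ET

# Inline stand-in for core-site.xml; in practice parse the real file, e.g.
# ET.parse("/data1/hadoop/hadoop/etc/hadoop/core-site.xml").getroot()
CORE_SITE = """<configuration>
  <property>
    <name>hadoop.http.authentication.type</name>
    <value>simple</value>
  </property>
  <property>
    <name>hadoop.http.authentication.simple.anonymous.allowed</name>
    <value>false</value>
  </property>
</configuration>"""

def auth_settings(root):
    """Return the hadoop.http.* properties as a name -> value dict."""
    return {
        p.findtext("name"): p.findtext("value")
        for p in root.iter("property")
        if p.findtext("name", "").startswith("hadoop.http.")
    }

settings = auth_settings(ET.fromstring(CORE_SITE))
print(settings["hadoop.http.authentication.type"])  # simple
```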
2. Restart the cluster
3. Authentication
Make sure the secret file /data1/hadoop/hadoop/etc/hadoop/hadoop-http-auth-signature-secret exists and contains a secret string. (Note: this is the signing secret for authentication tokens, not a list of permitted users; with simple auth, any user name passed in the URL is accepted.)
In a browser, open http://192.168.43.15:9870/?user.name=hadoop to access the web UI. In other words, append ?user.name=xxx to the URL.
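The ?user.name=xxx pattern above can also be produced programmatically, for example when scripting against the NameNode web UI. The host and port below are the ones from this setup; substitute your own NameNode address:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def webui_url(host: str, port: int, user: str, path: str = "/") -> str:
    """Build a web UI URL carrying the simple-auth user name."""
    return f"http://{host}:{port}{path}?{urlencode({'user.name': user})}"

# Host/port from this article's cluster; substitute your NameNode address.
url = webui_url("192.168.43.15", 9870, "hadoop")
print(url)  # http://192.168.43.15:9870/?user.name=hadoop
```

Using urlencode keeps the user name safe even if it contains characters that need percent-encoding.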
I originally wanted to use Kerberos for web authentication, but authentication kept failing and the web UI could not be reached, so I switched to simple authentication; I will look into Kerberos auth later. I am not sure whether the host running the browser needs its own Kerberos principal for web access.
https://blog.csdn.net/IUNIQUE/article/details/108615090 : this author apparently got Kerberos authentication working on Windows and was able to access the web UI.
References:
http://hadoop.apache.org/docs/r3.0.0/hadoop-project-dist/hadoop-common/HttpAuthentication.html
http://hadoop.apache.org/docs/stable/hadoop-auth/Examples.html
http://www.voidcn.com/article/p-bromwkhr-bth.html
https://blog.csdn.net/a822631129/article/details/48630093