[Original] Experience Sharing (130): UnknownHostException when a Docker container accesses HDFS

A Docker container accessing HDFS fails with UnknownHostException. The error is as follows:

java.lang.RuntimeException: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "namenode1":8020; java.net.UnknownHostException; For more details see:  http://wiki.apache.org/hadoop/UnknownHost
	at org.apache.gobblin.configuration.SourceState.materializeWorkUnitAndDatasetStates(SourceState.java:252)
	at org.apache.gobblin.configuration.SourceState.getPreviousWorkUnitStatesByDatasetUrns(SourceState.java:224)
	at org.apache.gobblin.source.extractor.extract.kafka.KafkaSource.getAllPreviousOffsetState(KafkaSource.java:664)
	at org.apache.gobblin.source.extractor.extract.kafka.KafkaSource.getPreviousOffsetForPartition(KafkaSource.java:617)
	at org.apache.gobblin.source.extractor.extract.kafka.KafkaSource.getWorkUnitForTopicPartition(KafkaSource.java:438)
	at org.apache.gobblin.source.extractor.extract.kafka.KafkaSource.getWorkUnitsForTopic(KafkaSource.java:389)
	at org.apache.gobblin.source.extractor.extract.kafka.KafkaSource.access$600(KafkaSource.java:82)
	at org.apache.gobblin.source.extractor.extract.kafka.KafkaSource$WorkUnitCreator.run(KafkaSource.java:901)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "namenode1":8020; java.net.UnknownHostException; For more details see:  http://wiki.apache.org/hadoop/UnknownHost
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
	at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:409)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1518)
	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy13.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
	at org.apache.gobblin.runtime.FsDatasetStateStore.getLatestDatasetStatesByUrns(FsDatasetStateStore.java:296)
	at org.apache.gobblin.runtime.CombinedWorkUnitAndDatasetStateGenerator.getCombinedWorkUnitAndDatasetState(CombinedWorkUnitAndDatasetStateGenerator.java:59)
	at org.apache.gobblin.configuration.SourceState.materializeWorkUnitAndDatasetStates(SourceState.java:246)
	... 12 more
Caused by: java.net.UnknownHostException
	at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:410)
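
The root cause is that the JVM inside the container cannot resolve the hostname namenode1 at all, before the Hadoop IPC client ever opens a connection. A quick way to confirm it is plain name resolution (rather than a Hadoop configuration problem) is to run the lookup directly inside the container; a minimal sketch, with the hostname taken from the error above (the class name is just for illustration):

    import java.net.InetAddress;
    import java.net.UnknownHostException;

    public class ResolveCheck {
        public static void main(String[] args) {
            // Hostname from the error message; pass a different one as the first argument if needed.
            String host = args.length > 0 ? args[0] : "namenode1";
            try {
                InetAddress addr = InetAddress.getByName(host);
                System.out.println(host + " -> " + addr.getHostAddress());
            } catch (UnknownHostException e) {
                // The same failure the Hadoop IPC client wraps in the stack trace above.
                System.out.println("cannot resolve " + host + ": " + e);
            }
        }
    }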

The following approaches were tried (example docker run commands after the list):

  • Map the host's /etc/hosts into the container
  • Add --add-host=namenode1:192.168.0.1
  • Add --net=host
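
For reference, the three attempts correspond to docker run options roughly like the following (the image name my-app is a placeholder; the hostname and IP are the ones from this example):

    docker run -v /etc/hosts:/etc/hosts:ro my-app       # map the host's /etc/hosts into the container
    docker run --add-host=namenode1:192.168.0.1 my-app  # inject a single hosts entry
    docker run --net=host my-app                         # share the host's network stack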

The problem persisted. Following the hint in the official documentation, edit the /etc/hosts file and change

192.168.0.1 namenode1

to (note the trailing dot added at the end):

192.168.0.1 namenode1 namenode1.

This fixed the problem.
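
A plausible explanation is that some part of the lookup uses the absolute form namenode1. (a trailing dot marks a fully qualified name in DNS), so adding that spelling as an alias lets the /etc/hosts entry match; treat this as an assumption rather than a confirmed diagnosis. After the change, resolution inside the container can be double-checked with a small sketch (class name is illustrative only):

    import java.net.InetAddress;

    public class HostCheck {
        public static void main(String[] args) throws Exception {
            // Hostname from the /etc/hosts entry above.
            InetAddress addr = InetAddress.getByName("namenode1");
            System.out.println("address:   " + addr.getHostAddress());
            // Reverse/canonical lookup should also return something sensible.
            System.out.println("canonical: " + addr.getCanonicalHostName());
        }
    }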

Reference: http://wiki.apache.org/hadoop/UnknownHost
