Spark job submitted in yarn-client mode times out: Connection to 192.168.. / has been quiet for 120000 ms while there are outstanding requests. Failed to send RPC ...
The error above came from running the SparkPi sample shipped with FusionInsight. It runs fine in local mode, but in yarn-client mode it hangs and then times out after 120 s (yarn-cluster mode fails the same way). I tried, in order:

- adding spark.driver.host=<client IP> to spark-defaults.conf on the client — no effect;
- setting up passwordless SSH between all the cluster hosts — no effect;
- adding the client's IP to the hosts file on the cluster nodes, so the hostname resolves directly — no effect;
- turning off the client machine's firewall and mapping the IPs in its hosts file as well — no effect.

Most of the Baidu results say to add two properties to yarn-site.xml:

<property>
<name>yarn.nodemanager.pmem-check-enabled</name>
<value>false</value>
</property>
<property>
<name>yarn.nodemanager.vmem-check-enabled</name>
<value>false</value>
</property>
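For context, the sample was being submitted roughly like this — a sketch only; the examples jar path, version, and the argument are assumptions, so adjust them for your FusionInsight installation:

```shell
# Submit the bundled SparkPi example in yarn-client mode.
# The jar path below is an assumption; your distribution's layout may differ.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_*.jar 100
```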

Trust me: this is usually not an out-of-memory problem. The root cause here is different, and those properties do nothing.

In the end, it was a post on * from overseas that finally showed me the light.

It turned out my virtual machine had only a single core: with just one core, Spark apparently sizes the Netty RPC dispatcher thread pool too small, so outstanding requests never get serviced and the connection times out. Setting spark.rpc.netty.dispatcher.numThreads=2 fixed it!
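A sketch of applying the fix, either per-submit or permanently (the jar path is an assumption):

```shell
# One-off: pass the setting on the command line when submitting.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.rpc.netty.dispatcher.numThreads=2 \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_*.jar 100

# Permanent: add one line to conf/spark-defaults.conf instead.
# spark.rpc.netty.dispatcher.numThreads   2
```

Alternatively, giving the VM a second vCPU should presumably avoid the problem without any configuration change at all.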


The clearest lesson: for technical problems from now on, especially with newer technology, search on *.
