failed to launch: nice -n 0 /home/hadoop/spark-2.3.3-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://namenode1:7077

After the Spark 2.3.3 installation is complete, starting the cluster reports the following error:

[hadoop@namenode1 sbin]$ ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/spark-2.3.3-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-namenode1.out
datanode2: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-2.3.3-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-datanode2.out
datanode3: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-2.3.3-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-datanode3.out
datanode1: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-2.3.3-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-datanode1.out
datanode1: failed to launch: nice -n 0 /home/hadoop/spark-2.3.3-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://namenode1:7077
datanode1: Spark Command: /usr/java/jdk1.8.0_111/bin/java -cp /home/hadoop/spark-2.3.3-bin-hadoop2.7/conf/:/home/hadoop/spark-2.3.3-bin-hadoop2.7/jars/* -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://namenode1:7077
datanode1: ========================================
datanode1: full log in /home/hadoop/spark-2.3.3-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-datanode1.out

failed to launch: nice -n 0 /home/hadoop/spark-2.3.3-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://namenode1:7077
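
This line does not necessarily mean the daemon is down, so before restarting it can be worth checking whether the Worker JVM actually came up on the node. A minimal check, assuming the JDK's jps tool is on the hadoop user's PATH (the prompt hostname is only illustrative):

[hadoop@datanode1 ~]$ jps

If a Worker process shows up in the output, the daemon started despite the message.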

The message says where the full log is, so I viewed it with cat plus that path: there were no errors, only warnings. I simply ran stop-all and started the cluster again, and the second startup usually comes up normally.
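
For reference, the same workflow as commands, using the log path reported above (prompts are illustrative; the stop/start scripts are the ones under the same sbin directory used earlier):

[hadoop@datanode1 ~]$ cat /home/hadoop/spark-2.3.3-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-datanode1.out
[hadoop@namenode1 sbin]$ ./stop-all.sh
[hadoop@namenode1 sbin]$ ./start-all.sh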
