Setting up Spark on Ubuntu

Environment:

Ubuntu 12.04

Hadoop 2.2.x

Spark 0.9

Scala scala-2.9.0.final.tgz

Steps

1. Download Scala

2. Unpack Scala, then edit /etc/profile and append the following:

export SCALA_HOME=/home/software/scala-2.9.0.final

export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:/home/software/eclipse:$ANT_HOME/bin:$SQOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin

3. Download Spark

Version: spark-0.9.0-incubating-bin-hadoop2.tgz
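A minimal download-and-unpack sketch. The mirror host/path is an assumption (0.9.0 releases have since moved to the Apache archive); the archive name is the one above, and it is assumed to unpack into a directory of the same name:

```shell
# Build the download URL; the mirror is an assumption, the file name
# is the release named above.
SPARK_TGZ=spark-0.9.0-incubating-bin-hadoop2.tgz
MIRROR=http://archive.apache.org/dist/incubator/spark/spark-0.9.0-incubating
echo "$MIRROR/$SPARK_TGZ"
# Then fetch, unpack, and move into place (uncomment to run):
# wget "$MIRROR/$SPARK_TGZ"
# tar -xzf "$SPARK_TGZ"
# sudo mv "${SPARK_TGZ%.tgz}" /opt/spark
```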

4. Edit /etc/profile again and append:

export SPARK_HOME=/opt/spark

export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:/home/software/eclipse:$ANT_HOME/bin:$SQOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
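The profile changes can be applied to the current shell and checked without logging out; a minimal sketch using the same paths as above:

```shell
# Apply the profile additions to the current shell (paths are the
# ones used above) and confirm both bin/ directories are on PATH.
export SCALA_HOME=/home/software/scala-2.9.0.final
export SPARK_HOME=/opt/spark
export PATH=$PATH:$SCALA_HOME/bin:$SPARK_HOME/bin
echo "$PATH" | grep -q "$SCALA_HOME/bin" && echo "scala on PATH"
echo "$PATH" | grep -q "$SPARK_HOME/bin" && echo "spark on PATH"
# With the tarballs actually unpacked there, `scala -version` works now.
```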

5. Go into conf/ and make the following changes:

cp spark-env.sh.template spark-env.sh

vim spark-env.sh

export SCALA_HOME=/home/software/scala-2.9.0.final
export JAVA_HOME=/home/software/jdk1.7.0_55
export SPARK_MASTER_IP=172.16.2.104
export SPARK_WORKER_MEMORY=1000m
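Briefly, what these four lines configure (same values, annotated):

```shell
export SCALA_HOME=/home/software/scala-2.9.0.final  # where Scala was unpacked
export JAVA_HOME=/home/software/jdk1.7.0_55         # JDK the Spark daemons run on
export SPARK_MASTER_IP=172.16.2.104                 # address the master binds to
export SPARK_WORKER_MEMORY=1000m                    # memory each worker can offer to jobs
```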

6. Edit conf/slaves and list the worker hosts:

localhost

datanode1
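start-all.sh launches a worker on every host listed in conf/slaves over SSH, so passwordless login from the master to each of them (including localhost) is needed. A minimal sketch, assuming the same user account on every host:

```shell
# Generate a key pair once (no-op if one already exists) ...
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# ... then push the public key to each host in conf/slaves, e.g.:
#   ssh-copy-id localhost
#   ssh-copy-id datanode1
```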

7. Start/stop Spark

sbin/start-all.sh
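start-all.sh has a matching stop-all.sh in sbin/, and jps (from the JDK) is a quick way to check that the daemons are up. A guarded sketch:

```shell
SPARK_HOME=/opt/spark
if [ -x "$SPARK_HOME/sbin/start-all.sh" ]; then
  "$SPARK_HOME/sbin/start-all.sh"   # Master here + a Worker per slaves entry
  jps                               # expect Master (and Worker) in the list
  "$SPARK_HOME/sbin/stop-all.sh"    # shuts the whole cluster down again
else
  echo "Spark not found at $SPARK_HOME"
fi
```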

8. Browse the master UI

http://robinson-ubuntu:8080

9. Run an example (local mode)

run-example org.apache.spark.examples.SparkPi local

10. Run an example (standalone cluster)

run-example org.apache.spark.examples.SparkPi spark://172.16.2.104:7077

11. Run another example (standalone cluster)

run-example org.apache.spark.examples.SparkLR spark://172.16.2.104:7077
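`run-example` resolves because $SPARK_HOME/bin is on the PATH from step 4; it takes the example class plus a master URL. `local` runs in-process (`local[N]` for N threads), while `spark://host:7077` submits to the standalone master (7077 is the default master port). A guarded sketch:

```shell
MASTER_URL=spark://172.16.2.104:7077      # master address configured above
if command -v run-example >/dev/null; then
  run-example org.apache.spark.examples.SparkPi "local[2]"   # two local threads
  run-example org.apache.spark.examples.SparkPi "$MASTER_URL"
else
  echo "run-example not on PATH; source /etc/profile first"
fi
```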

References:
http://www.tuicool.com/articles/NB3imuY
http://blog.csdn.net/myboyliu2007/article/details/17174363

Copyright notice: this is the blog author's original post; do not reproduce without permission.
