Integrating Spark on Ubuntu


Downloading the JDK in an Ubuntu Docker container:

Create two directories under /opt: software and modules.
software holds the downloaded archives; modules holds the extracted files.
First, download the JDK from Oracle's site:
wget 'https://download.oracle.com/java/17/latest/jdk-17_linux-x64_bin.tar.gz'
Extract it into the modules directory. You can give it a shorter name; I call mine jdk17:

tar -zxf /opt/software/jdk-17_linux-x64_bin.tar.gz -C /opt/modules/
cd /opt/modules
mv jdk-17.0.1 jdk17

(The extracted directory name depends on the exact JDK release you downloaded; adjust the mv accordingly.)

Then append the following to /etc/profile:

export JAVA_HOME=/opt/modules/jdk17
export CLASSPATH=.:${JAVA_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

(JDK 9 and later no longer ship a separate jre directory, so there is no JRE_HOME to set.)

Reload the profile:

source /etc/profile

Download Scala. Note that the Spark build used below (spark-3.2.0-bin-hadoop3.2-scala2.13) is compiled against Scala 2.13, so install a 2.13.x release rather than 2.11:

wget 'https://downloads.lightbend.com/scala/2.13.8/scala-2.13.8.tgz'

Extract it into /opt/modules and rename the directory to scala2.
Then append the following to /etc/profile:

export SCALA_HOME=/opt/modules/scala2
export PATH=${SCALA_HOME}/bin:$PATH

Reload the profile:

source /etc/profile

Test that the installation succeeded:

scala -version
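Once /etc/profile has been sourced, both toolchains can be checked in one pass. A small sketch; it only reports what is on PATH, so it is safe to run anywhere:

```shell
# Report the java and scala versions when the tools are on PATH;
# `command -v` keeps the loop from failing when one is missing.
for tool in java scala; do
  if command -v "$tool" >/dev/null 2>&1; then
    "$tool" -version
  else
    echo "$tool: not found on PATH"
  fi
done
```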

Install Spark:

wget 'https://dlcdn.apache.org/spark/spark-3.2.0/spark-3.2.0-bin-hadoop3.2-scala2.13.tgz'

Extract it into /opt/modules and rename the directory to spark.
Then append the following to /etc/profile:

export SPARK_HOME=/opt/modules/spark
export PATH=${SPARK_HOME}/bin:$PATH

By the way, we can combine all three into a single block:

export JAVA_HOME=/opt/modules/jdk17
export CLASSPATH=.:${JAVA_HOME}/lib
export SCALA_HOME=/opt/modules/scala2
export SPARK_HOME=/opt/modules/spark

export PATH=${JAVA_HOME}/bin:${SCALA_HOME}/bin:${SPARK_HOME}/bin:${SPARK_HOME}/sbin:$PATH
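A quick way to sanity-check the combined block is to paste it into a throwaway shell and inspect the front of PATH. The paths below are the ones chosen earlier in this guide; adjust if yours differ:

```shell
# Apply the exports and print the first four PATH entries, which
# should be the jdk17, scala2, spark/bin and spark/sbin directories.
export JAVA_HOME=/opt/modules/jdk17
export SCALA_HOME=/opt/modules/scala2
export SPARK_HOME=/opt/modules/spark
export PATH=${JAVA_HOME}/bin:${SCALA_HOME}/bin:${SPARK_HOME}/bin:${SPARK_HOME}/sbin:$PATH
echo "$PATH" | tr ':' '\n' | head -4
```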

Copy the template config files:

root@b94e802c7b46:/opt/modules/spark/conf# cp log4j.properties.template log4j.properties
root@b94e802c7b46:/opt/modules/spark/conf# cp spark-defaults.conf.template spark-defaults.conf
root@b94e802c7b46:/opt/modules/spark/conf# cp spark-env.sh.template spark-env.sh
root@b94e802c7b46:/opt/modules/spark/conf# cp workers.template workers
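The four copies follow one pattern, so a loop does the same job. It is sketched here against a scratch directory so it can be tried without a Spark install; in the real setup, cd into /opt/modules/spark/conf and drop the scratch setup lines:

```shell
# Scratch stand-in for $SPARK_HOME/conf with empty template files.
conf=$(mktemp -d)
files="log4j.properties spark-defaults.conf spark-env.sh workers"
for f in $files; do touch "$conf/$f.template"; done

# The actual pattern: strip the .template suffix via cp.
for f in $files; do cp "$conf/$f.template" "$conf/$f"; done
ls "$conf"
```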

Still in the /opt/modules/spark/conf directory, edit spark-env.sh:

vim spark-env.sh

and append:

export JAVA_HOME=/opt/modules/jdk17
export SCALA_HOME=/opt/modules/scala2
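Beyond JAVA_HOME and SCALA_HOME, spark-env.sh is where standalone-mode settings usually live. The variable names below are standard Spark ones, but the values are only illustrative, not requirements:

```shell
# Illustrative standalone-mode settings for spark-env.sh (example values).
export SPARK_MASTER_HOST=localhost   # host the standalone master binds to
export SPARK_WORKER_MEMORY=1g        # memory each worker may use
export SPARK_WORKER_CORES=1          # cores each worker may use
```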

Verification succeeded: Spark is now integrated on Ubuntu. Running spark-shell should start a Scala REPL with a ready SparkContext.
