Setting up a local Spark environment
Setting up the Java environment
(1) Download the JDK from the official website
Official link: https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
(2) Extract it to the target directory
>sudo tar -zxvf jdk-8u91-linux-x64.tar.gz -C /usr/lib/jdk  # adjust the version number to match your download
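Note that `tar -C` extracts into an existing directory but does not create it, so `/usr/lib/jdk` must be created first (e.g. `sudo mkdir -p /usr/lib/jdk`). A minimal sketch of the same pattern, using throwaway paths instead of the real JDK archive:

```shell
# Build and extract a sample archive with the same flags as the JDK step.
# All paths here are temporary stand-ins, not the real install locations.
work=$(mktemp -d)
mkdir -p "$work/src/demo"
echo "hello" > "$work/src/demo/file.txt"
tar -zcf "$work/demo.tar.gz" -C "$work/src" demo   # build a sample archive
mkdir -p "$work/dest"                              # destination must exist first
tar -zxvf "$work/demo.tar.gz" -C "$work/dest"      # same -zx...-C pattern as above
cat "$work/dest/demo/file.txt"                     # prints: hello
```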
(3) Set the path and environment variables
>sudo vim /etc/profile
Append the following at the end of the file:
export JAVA_HOME=/usr/lib/jdk/jdk1.8.0_91
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
(4) Apply the configuration
source /etc/profile
(5) Verify the installation
~$ java -version
java version "1.8.0_181"
Java(TM) SE Runtime Environment (build 1.8.0_181-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.181-b13, mixed mode)
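The export lines above prepend ${JAVA_HOME}/bin to PATH, so the JDK you just installed shadows any other java on the system. A small sketch with a fake JAVA_HOME (a temporary directory holding a stub script, not a real JDK) shows the lookup order:

```shell
# Fake JAVA_HOME containing a stub "java" binary (illustration only)
JAVA_HOME=$(mktemp -d)
mkdir -p "$JAVA_HOME/bin"
printf '#!/bin/sh\necho stub-java\n' > "$JAVA_HOME/bin/java"
chmod +x "$JAVA_HOME/bin/java"
# Same prepend as in /etc/profile: the new directory is searched first
PATH=${JAVA_HOME}/bin:$PATH
command -v java    # resolves to $JAVA_HOME/bin/java, ahead of any system java
java               # prints: stub-java
```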
Installing Scala
(1) Download the package from the official website
Official link: https://www.scala-lang.org/download/
(2) Extract it to the target directory
sudo tar -zxvf scala-2.11.8.tgz -C /usr/lib/scala  # adjust the version number to match your download
(3) Set the path and environment variables
>sudo vim /etc/profile
Append the following at the end of the file:
export SCALA_HOME=/usr/lib/scala/scala-2.11.8  # adjust the version number to match your download
export PATH=${SCALA_HOME}/bin:$PATH
(4) Apply the configuration
source /etc/profile
(5) Verify the installation
:~$ scala
Welcome to Scala 2.12.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181).
Type in expressions for evaluation. Or try :help.
scala>
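The append-then-source pattern repeats in every section. It can be tried out safely against a temporary file instead of the real /etc/profile (the paths below are illustrative):

```shell
profile=$(mktemp)          # stand-in for /etc/profile
cat >> "$profile" <<'EOF'
export SCALA_HOME=/usr/lib/scala/scala-2.11.8
export PATH=${SCALA_HOME}/bin:$PATH
EOF
. "$profile"               # same effect as: source /etc/profile
echo "$SCALA_HOME"         # prints: /usr/lib/scala/scala-2.11.8
```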
Installing Spark
(1) Download the package from the official website
Official link: http://spark.apache.org/downloads.html
(2) Extract it to the target directory
sudo tar -zxvf spark-1.6.1-bin-hadoop2.6.tgz -C /usr/lib/spark  # adjust the version number to match your download
(3) Set the path and environment variables
>sudo vim /etc/profile
Append the following at the end of the file:
export SPARK_HOME=/usr/lib/spark/spark-1.6.1-bin-hadoop2.6
export PATH=${SPARK_HOME}/bin:$PATH
(4) Apply the configuration
source /etc/profile
(5) Verify the installation
:~$ cd spark-1.6.1-bin-hadoop2.6
:~/spark-1.6.1-bin-hadoop2.6$ ./bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/09/30 20:59:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/09/30 20:59:32 WARN Utils: Your hostname, pxh resolves to a loopback address: 127.0.1.1; using 10.22.48.4 instead (on interface wlan0)
18/09/30 20:59:32 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/09/30 20:59:45 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://10.22.48.4:4040
Spark context available as 'sc' (master = local[*], app id = local-1538312374870).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
Type in expressions to have them evaluated.
Type :help for more information.
Installing sbt
(1) Download the package from the official website
Official link: https://www.scala-sbt.org/download.html
(2) Extract it to the target directory
tar -zxvf sbt-0.13.9.tgz -C /usr/local/sbt
(3) Create an sbt script in /usr/local/sbt with the following content
$ cd /usr/local/sbt
$ vim sbt
# add the following to the sbt script file:
#!/bin/bash
SBT_OPTS="-Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M"
java $SBT_OPTS -jar /usr/local/sbt/bin/sbt-launch.jar "$@"
(4) After saving, make the sbt script executable
$ chmod u+x sbt
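The script's final `"$@"` forwards whatever you type after `sbt` (e.g. `sbt compile`) on to sbt-launch.jar with quoting preserved. A stand-in wrapper (hypothetical, echoing its arguments instead of launching java) demonstrates this:

```shell
wrapper=$(mktemp)
cat > "$wrapper" <<'EOF'
#!/bin/sh
# stand-in for the sbt script: print each forwarded argument on its own line
printf '%s\n' "$@"
EOF
chmod u+x "$wrapper"
"$wrapper" compile "two words"
# prints:
# compile
# two words
```

Because `"$@"` expands each argument as a separate quoted word, `"two words"` arrives as a single argument; an unquoted `$@` or `$*` would split it in two.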
(5) Set the path and environment variables
>sudo vim /etc/profile
Append the following at the end of the file:
export PATH=/usr/local/sbt/:$PATH