This article walks through installing Spark 1.5.1 on a cluster that already runs Hadoop 2.x.
1. Hadoop 2.x Installation
Hadoop 2.x installation guide: http://my.oschina.net/u/204498/blog/519789
2. Spark 1.5.1 Installation
2.1 Download Spark 1.5.1
http://spark.apache.org/downloads.html
Choose the Spark release built for your Hadoop version (here, the pre-built package for Hadoop 2.6):
[hadoop@hftclclw0001 ~]$ pwd
/home/hadoop
[hadoop@hftclclw0001 ~]$ wget
[hadoop@hftclclw0001 ~]$ ll
total 480004
drwxr-xr-x 11 hadoop root 4096 Jan 17 04:54 hadoop-2.7.1
-rw------- 1 hadoop root 210606807 Jan 17 04:09 hadoop-2.7.1.tar.gz
drwxr-xr-x 13 hadoop root 4096 Jan 18 08:31 spark-1.5.1-bin-hadoop2.6
-rw------- 1 hadoop root 280901736 Jan 17 04:08 spark-1.5.1-bin-hadoop2.6.tgz
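The listing above already shows the extracted directory next to the tarball; for completeness, a sketch of the unpack step (the guard keeps it a no-op if the tarball is not present):

```shell
# Run in /home/hadoop after the download completes; tar creates the
# spark-1.5.1-bin-hadoop2.6/ directory shown in the listing above.
if [ -f spark-1.5.1-bin-hadoop2.6.tgz ]; then
  tar -zxf spark-1.5.1-bin-hadoop2.6.tgz
fi
```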
[hadoop@hftclclw0001 conf]$ pwd
/home/hadoop/spark-1.5.1-bin-hadoop2.6/conf
[hadoop@hftclclw0001 conf]$ cp -p slaves.template slaves
[hadoop@hftclclw0001 conf]$ vi slaves  => list the worker (slave) nodes, one hostname per line
hfspark0003.webex.com
hfspark0007.webex.com
[hadoop@hftclclw0001 conf]$ cp -p spark-env.sh.template spark-env.sh
[hadoop@hftclclw0001 conf]$ vi spark-env.sh  => configure Spark environment variables
...
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.25-bin.jar:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/ojdbc6.jar  => JDBC drivers (MySQL, Oracle) for Spark SQL
...
export HADOOP_HOME=/home/hadoop/hadoop-2.7.1
export HADOOP_CONF_DIR=/home/hadoop/hadoop-2.7.1/etc/hadoop
export SPARK_MASTER_IP=hftclclw0001.webex.com
export SPARK_WORKER_MEMORY=4g
export JAVA_HOME=/usr/java/
...
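Mistakes in spark-env.sh (a mistyped variable name or a wrong path) tend to fail silently at startup, so it can help to sanity-check the configured paths first. `check_dir` is a hypothetical helper, not part of Spark:

```shell
# Hypothetical helper: report whether each path configured in spark-env.sh
# actually exists on this machine. Prints OK or MISSING per entry.
check_dir() {
  # $1 = variable name (for the report), $2 = path to verify
  if [ -d "$2" ]; then
    echo "OK      $1=$2"
  else
    echo "MISSING $1=$2"
  fi
}

# Paths taken from the spark-env.sh above:
check_dir JAVA_HOME       /usr/java/
check_dir HADOOP_HOME     /home/hadoop/hadoop-2.7.1
check_dir HADOOP_CONF_DIR /home/hadoop/hadoop-2.7.1/etc/hadoop
```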
3. Copy Spark to the Other Machines
[hadoop@hftclclw0001 ~]$ pwd
/home/hadoop
[hadoop@hftclclw0001 ~]$ scp -r spark-1.5.1-bin-hadoop2.6 hadoop@{ip}:/home/hadoop  => run once per worker, substituting each worker's address for {ip}
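Rather than repeating the scp by hand, the host list already in conf/slaves can drive the copy. `distribute_cmds` is a hypothetical helper that prints one scp command per host; pipe its output to `sh` to execute it, assuming passwordless SSH is set up for the hadoop user:

```shell
# Hypothetical helper: read worker hostnames from a slaves file (skipping
# blank lines and # comments) and print one scp command per host.
distribute_cmds() {
  # $1 = path to the slaves file, $2 = directory to distribute
  grep -E -v '^[[:space:]]*(#|$)' "$1" | while read -r host; do
    echo "scp -r $2 hadoop@${host}:/home/hadoop"
  done
}

# Usage (pipe to sh to actually run the copies):
#   distribute_cmds conf/slaves /home/hadoop/spark-1.5.1-bin-hadoop2.6 | sh
```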
4. Start the Cluster
[hadoop@hftclclw0001 spark-1.5.1-bin-hadoop2.6]$ ./sbin/start-all.sh  => run on the master node; this starts the Master locally and a Worker on each host listed in conf/slaves
...
...
5. Verify
a. jps  => the master node should show a Master process, and each worker node a Worker process
b. Web UI  => http://${ip}:8080 (use the master node's address)
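The jps check can be scripted. `has_daemon` below is a hypothetical helper that looks for a daemon name in captured `jps` output; run it on the master expecting Master, and on each worker expecting Worker:

```shell
# Hypothetical helper: return success if the given daemon name appears
# (as a whole word) in the supplied `jps` output.
has_daemon() {
  # $1 = jps output, $2 = daemon name (Master or Worker)
  echo "$1" | grep -qw "$2"
}

# Usage on a cluster node:
#   out=$(jps)
#   has_daemon "$out" Master && echo "Spark Master is up"
#   has_daemon "$out" Worker && echo "Spark Worker is up"
```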