Posted by xglys on 2018-10-29 10:18:04

Hadoop & Spark Installation (Part 2)

# Pack /usr/local/spark and /usr/local/scala, then copy the archives to the slave nodes
cd /usr/local
  
tar -zcf ~/master.spark.tar.gz ./spark
  
tar -zcf ~/master.scala.tar.gz ./scala
  
scp master.spark.tar.gz hddcluster1:~
  
scp master.scala.tar.gz hddcluster1:~
  
# Log in to each slave node and extract the archives to /usr/local
  
tar -zxf master.spark.tar.gz -C /usr/local/
  
tar -zxf master.scala.tar.gz -C /usr/local/
  
chown -R hadoop:hadoop /usr/local/spark
  
chown -R hadoop:hadoop /usr/local/scala
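With more than one slave, the copy/extract/chown steps above can be scripted. The sketch below is an assumption-laden example: `NODES` lists hypothetical slave hostnames (only hddcluster1 appears in this post), and `DRY_RUN=1`, the default here, prints each command instead of running it, so you can review the plan before unsetting it for a real run.

```shell
# Sketch: distribute the tarballs to every slave and unpack them in one pass.
# NODES and DRY_RUN are assumptions for illustration; adjust to your cluster.
DRY_RUN=${DRY_RUN:-1}
NODES="hddcluster1"                      # add further slave hostnames here
TARBALLS="master.spark.tar.gz master.scala.tar.gz"

run() {
  # Print the command in dry-run mode, execute it otherwise.
  if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi
}

for node in $NODES; do
  for tb in $TARBALLS; do
    run scp "$HOME/$tb" "$node:~"                    # copy the archive over
    run ssh "$node" "tar -zxf ~/$tb -C /usr/local/"  # unpack on the slave
  done
  run ssh "$node" "chown -R hadoop:hadoop /usr/local/spark /usr/local/scala"
done
```

Running it once with the default `DRY_RUN=1` shows exactly which `scp`/`ssh` commands would be issued per node.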
  
Then configure the .bashrc environment variables on each slave to match the master's.
  
Combined with the Hadoop entries from the previous post, the .bashrc looks like this:
  
#scala
export SCALA_HOME=/usr/local/scala

#spark
export SPARK_HOME=/usr/local/spark

#java and hadoop
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.x86_64
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib:$HADOOP_PREFIX/lib/native"
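After copying these entries to a slave's .bashrc, it is worth confirming that the variables resolve in a fresh shell. The check below is a self-contained sketch: the exports are repeated inline so it can run anywhere, while on a real node you would instead reload with `source ~/.bashrc` (assuming your login shell reads that file).

```shell
# Sketch: verify the key environment variables expand to the expected paths.
# Inline exports stand in for `source ~/.bashrc` so the check is portable.
export SCALA_HOME=/usr/local/scala
export SPARK_HOME=/usr/local/spark
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin

# Each variable should print a non-empty /usr/local path:
for v in SCALA_HOME SPARK_HOME HADOOP_HOME; do
  eval val=\$$v
  echo "$v=$val"
done
```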
  
This completes the Spark cluster setup.
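As a final smoke test, you can submit the SparkPi example that ships with Spark to the cluster. The sketch below makes two assumptions not in the post: `MASTER_URL` uses a hypothetical master hostname (substitute your own), and the examples jar lives under `$SPARK_HOME/examples/jars/` (the Spark 2.x layout). A guard lets it exit cleanly on machines where spark-submit is not yet on the PATH.

```shell
# Sketch: run the bundled SparkPi job against the cluster.
# MASTER_URL is a hypothetical value -- replace with your master's host:port.
MASTER_URL="spark://hddcluster0:7077"

if command -v spark-submit >/dev/null 2>&1; then
  # Submit the example; a successful run prints "Pi is roughly ..."
  spark-submit --master "$MASTER_URL" \
    --class org.apache.spark.examples.SparkPi \
    "$SPARK_HOME"/examples/jars/spark-examples_*.jar 100
else
  echo "spark-submit not found; run 'source ~/.bashrc' first"
fi
```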

