Notes on Setting Up an Integrated Hadoop/Spark Distributed Environment
### If wget is not installed, install it first with yum -y install wget ###
### Oracle's download site requires accepting the license, so wget needs the extra parameters below ###
# wget --no-cookies --no-check-certificate --header "Cookie: oraclelicense=accept-securebackup-cookie" "http://download.oracle.com/otn-pub/java/jdk/8u112-b15/jdk-8u112-linux-x64.rpm"
# ls
anaconda-ks.cfg  jdk-8u112-linux-x64.rpm
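Optionally, before installing, the package header and digests can be inspected (a quick sanity check on the download, nothing more):
# rpm -qpi jdk-8u112-linux-x64.rpm
# rpm -K jdk-8u112-linux-x64.rpm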
# rpm -ivh jdk-8u112-linux-x64.rpm
Preparing... #################################
Updating / installing...
1:jdk1.8.0_112-2000:1.8.0_112-fcs  #################################
Unpacking JAR files...
tools.jar...
plugin.jar...
javaws.jar...
deploy.jar...
rt.jar...
jsse.jar...
charsets.jar...
localedata.jar...
# java -version
java version "1.8.0_112"
Java(TM) SE Runtime Environment (build 1.8.0_112-b15)
Java HotSpot(TM) 64-Bit Server VM (build 25.112-b15, mixed mode)
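If another JDK (for example an OpenJDK shipped with the system image) was already present, the java found on PATH may not be the one just installed; on CentOS/RHEL the alternatives tool can show and, if needed, switch the system default:
# alternatives --display java
# alternatives --config java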
### Download and install Scala ###
# wget
# rpm -ivh scala-2.11.8.rpm
Preparing... #################################
Updating / installing...
1:scala-2.11.8-0 #################################
# scala -version
Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL
# whereis scala
scala: /usr/bin/scala /usr/share/scala /usr/share/man/man1/scala.1.gz
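As one more optional check, the runner can evaluate a one-liner directly; this should print the 2.11.8 version string:
# scala -e 'println(util.Properties.versionString)'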
### Configure environment variables ###
# vi /etc/profile
JAVA_HOME=/usr/java/jdk1.8.0_112
SCALA_HOME=/usr/share/scala
PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:$HADOOP_HOME/bin
export JAVA_HOME SCALA_HOME PATH
# source /etc/profile
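HADOOP_HOME appears in PATH above but is presumably only set once Hadoop itself is installed in a later step, so for now it simply expands to nothing. A quick check that the variables took effect in the current shell:
# echo $JAVA_HOME
/usr/java/jdk1.8.0_112
# echo $SCALA_HOME
/usr/share/scala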