Download the Hive package: https://archive.apache.org/dist/hive/hive-1.2.1/
My Spark is spark-2.2.0-bin-2.6.0-cdh5.7.0, my Hadoop is hadoop-2.6.0-cdh5.7.0, and the JDK is 1.8.
The installation was a bumpy ride with a lot of errors; most of them turned out to be caused by MySQL being set up incorrectly.
Step 1: Install MySQL
Switch Ubuntu's apt sources to the Aliyun mirror by adding the following entries to /etc/apt/sources.list (a short sketch of backing up and editing the file follows the list):
deb http://mirrors.aliyun.com/ubuntu/ xenial-updates main
deb-src http://mirrors.aliyun.com/ubuntu/ xenial-updates main
deb http://mirrors.aliyun.com/ubuntu/ xenial universe
deb-src http://mirrors.aliyun.com/ubuntu/ xenial universe
deb http://mirrors.aliyun.com/ubuntu/ xenial-updates universe
deb-src http://mirrors.aliyun.com/ubuntu/ xenial-updates universe
deb http://mirrors.aliyun.com/ubuntu/ xenial-security main
deb-src http://mirrors.aliyun.com/ubuntu/ xenial-security main
deb http://mirrors.aliyun.com/ubuntu/ xenial-security universe
deb-src http://mirrors.aliyun.com/ubuntu/ xenial-security universe
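A minimal sketch of applying the mirror switch, assuming Ubuntu 16.04 (xenial) and the default /etc/apt/sources.list location:
# back up the original source list, then add the Aliyun entries above
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
sudo vi /etc/apt/sources.list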
sudo apt-get update (refresh the package index so the new mirror entries take effect)
sudo apt-get install mysql-server (remember to set the root password when prompted)
Edit the mysqld.cnf file under /etc/mysql/mysql.conf.d:
change the bind-address line to bind-address = 0.0.0.0, which allows MySQL to be accessed from any IP.
After the change, restart MySQL: service mysql restart
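If you prefer doing it from the command line, a minimal sketch assuming the default Ubuntu 16.04 file layout (the sed pattern assumes the stock bind-address line):
# rewrite the bind-address line in place, then restart the service
sudo sed -i 's/^bind-address.*/bind-address = 0.0.0.0/' /etc/mysql/mysql.conf.d/mysqld.cnf
sudo service mysql restart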
Enter MySQL with mysql -u root -p and type the password you set (if you put the password on the command line there must be no space after -p, e.g. mysql -u root -p123).
Grant root access to MySQL from any host:
grant all privileges on *.* to 'root'@'%' identified by 'your_mysql_password';
Create the hive database:
create database hive;
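Two follow-up statements in the same MySQL session can help confirm the setup (these are additions to the original steps):
flush privileges;   -- reload the grant tables so the remote-access grant takes effect
show databases;     -- the hive database should now appear in the list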
Step 2: Install Hive
Unpack Hive and set the HIVE_HOME environment variable, as sketched below.
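A minimal sketch, assuming the tarball downloaded from the link above and /usr/local as the install directory (the path is an assumption, adjust to your layout):
tar -zxvf apache-hive-1.2.1-bin.tar.gz -C /usr/local
# append to ~/.bashrc, then run: source ~/.bashrc
export HIVE_HOME=/usr/local/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin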
Copy mysql-connector-java-8.0.11.jar into Hive's lib directory (with Connector/J 8.x the legacy driver class com.mysql.jdbc.Driver configured below still works, though it is deprecated in favour of com.mysql.cj.jdbc.Driver).
Edit hive-env.sh (copy it from hive-env.sh.template under conf/ if it does not exist yet) and add the HADOOP_HOME path; see the sketch below.
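A minimal sketch of these two steps; the Hadoop install path is an assumption, adjust it to your layout:
# copy the JDBC driver into Hive's lib directory
cp mysql-connector-java-8.0.11.jar $HIVE_HOME/lib/
# create hive-env.sh from the template and point it at Hadoop
cp $HIVE_HOME/conf/hive-env.sh.template $HIVE_HOME/conf/hive-env.sh
echo 'export HADOOP_HOME=/usr/local/hadoop-2.6.0-cdh5.7.0' >> $HIVE_HOME/conf/hive-env.sh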
Create a new hive-site.xml file under conf/ with the following configuration:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>your_mysql_password</value>
  </property>
</configuration>
If Hive fails to start because it cannot find Spark's spark-assembly-*.jar (Spark 2.x no longer ships an assembly jar; its jars live under jars/ instead), edit the bin/hive script and change
sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
to:
sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`
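The same edit as a one-liner, a convenience sketch rather than part of the original steps (double-check bin/hive afterwards):
sed -i 's|lib/spark-assembly-\*\.jar|jars/*.jar|' $HIVE_HOME/bin/hive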
Finally, start Hive with bin/hive; if everything is configured correctly, the final screen is the hive> prompt.
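A quick sanity check once the hive> prompt is up (a sketch; demo_tbl is a made-up table name):
hive> show databases;
hive> create table demo_tbl (id int);
hive> show tables;
If demo_tbl shows up here and its metadata appears in the TBLS table of the MySQL hive database, the metastore is wired up correctly.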