Hive installation. Reference: http://www.open-open.com/lib/view/open1330908496483.html
Hive installation and configuration
Hive can be installed on any node in the cluster.
Place the extracted hive-0.8.1 directory at /opt/soft/hive.
Add the following to /etc/profile:
export HIVE_HOME=/opt/soft/hive
export PATH=$HIVE_HOME/bin:$PATH
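Changes to /etc/profile only take effect in new login shells. A minimal sketch (assuming the paths above) to apply them to the current shell and verify that Hive's bin directory landed on PATH:

```shell
# apply the same exports in the current shell
# (assumes HIVE_HOME really is /opt/soft/hive, as configured above)
export HIVE_HOME=/opt/soft/hive
export PATH=$HIVE_HOME/bin:$PATH

# verify that hive's bin directory is now on PATH
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "PATH ok" ;;
  *)                    echo "PATH missing $HIVE_HOME/bin" ;;
esac
```

Alternatively, `source /etc/profile` reloads the whole file.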
In conf/hive-env.sh.template under the hive-0.8.1 directory, set HADOOP_HOME to the actual Hadoop installation directory (/opt/soft/hadoop here):
cp hive-env.sh.template hive-env.sh
vim hive-env.sh
export JAVA_HOME=/opt/soft/jdk
HADOOP_HOME=/opt/soft/hadoop
cp hive-default.xml hive-site.xml
vim hive-site.xml
hive.metastore.warehouse.dir: warehouse data directory (on HDFS)
hive.exec.scratchdir: temporary/scratch file directory (on HDFS)
Metastore database connection settings:
javax.jdo.option.ConnectionURL: metastore JDBC connection string
javax.jdo.option.ConnectionDriverName: JDBC driver class; for MySQL this is com.mysql.jdbc.Driver
javax.jdo.option.ConnectionUserName: database username
javax.jdo.option.ConnectionPassword: database password
Example:
create database hivedb character set latin1;
grant all privileges on hivedb.* to hiveuser@'%' identified by 'hiveuser';
Copy mysql-connector-java-*.jar into hive/lib.
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://10.58.120.7:3310/hivedb?useUnicode=true&amp;characterEncoding=UTF-8&amp;createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hiveuser</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hiveuser</value>
<description>password to use against metastore database</description>
</property>
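Note that the `&` separators in the JDBC URL must be written as `&amp;` inside hive-site.xml; a bare `&` is not well-formed XML and Hive will fail to read the file. A quick well-formedness check (a sketch, assuming python3 is available; the fragment below is a cut-down copy of the config above, written to a temp path):

```shell
# write a minimal hive-site.xml fragment and check that it parses as XML;
# the &amp; entities are required -- a bare & would make the parse fail
cat > /tmp/hive-site-check.xml <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://10.58.120.7:3310/hivedb?useUnicode=true&amp;characterEncoding=UTF-8&amp;createDatabaseIfNotExist=true</value>
  </property>
</configuration>
EOF
python3 -c 'import xml.etree.ElementTree as ET; ET.parse("/tmp/hive-site-check.xml"); print("well-formed")'
```

`xmllint --noout hive-site.xml` works the same way where libxml2 is installed.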
Hive authorization
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Authorization
Starting Hive fails with the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
Solution:
vim hadoop-env.sh
export HADOOP_CLASSPATH=${HBASE_HOME}/hbase-0.90.3.jar:${HBASE_HOME}/hbase-0.90.3-test.jar:${HBASE_HOME}/conf:${HBASE_HOME}/lib/zookeeper-3.3.2.jar:${HBASE_HOME}/lib/guava-r06.jar
Change it to (note: simply prepend $HADOOP_CLASSPATH: to the original value):
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:${HBASE_HOME}/hbase-0.90.3.jar:${HBASE_HOME}/hbase-0.90.3-test.jar:${HBASE_HOME}/conf:${HBASE_HOME}/lib/zookeeper-3.3.2.jar:${HBASE_HOME}/lib/guava-r06.jar
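The point of the fix is to append to, rather than overwrite, whatever HADOOP_CLASSPATH already holds (Hive's launcher puts its own jars there, which is why overwriting breaks HiveConf loading). A sketch of the pattern, with a hypothetical pre-existing jar path standing in for what Hive adds:

```shell
# simulate a classpath already populated by another launcher script
# (the hive-exec jar name here is hypothetical, for illustration only)
HADOOP_CLASSPATH=/opt/soft/hive/lib/hive-exec-0.8.1.jar
HBASE_HOME=/opt/soft/hbase

# wrong: HADOOP_CLASSPATH=${HBASE_HOME}/hbase-0.90.3.jar would discard the entry above

# right: prepend $HADOOP_CLASSPATH: so existing entries survive
HADOOP_CLASSPATH=$HADOOP_CLASSPATH:${HBASE_HOME}/hbase-0.90.3.jar
echo "$HADOOP_CLASSPATH"
# → /opt/soft/hive/lib/hive-exec-0.8.1.jar:/opt/soft/hbase/hbase-0.90.3.jar
```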
Create an external partitioned table:
CREATE EXTERNAL TABLE t4 (id INT, name STRING, count INT)
PARTITIONED BY (adddate STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE LOCATION '/test/user4';
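The FIELDS TERMINATED BY '\t' clause means the files loaded below must be tab-delimited, with one column per field of (id, name, count). A hypothetical /work/user.txt could be generated and sanity-checked like this (written to /tmp here; the rows are made-up sample data):

```shell
# generate a small tab-delimited sample matching the t4 schema (id, name, count)
printf '1\talice\t10\n2\tbob\t20\n' > /tmp/user.txt

# sanity check: every line should have exactly 3 tab-separated fields
awk -F'\t' 'NF != 3 { bad = 1 } END { print (bad ? "bad" : "3 fields per line") }' /tmp/user.txt
# → 3 fields per line
```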
hadoop fs -put /work/user.txt /test/user4/user_20120404/user
hadoop fs -put /work/user.txt /test/user4/user_20120405/user
ALTER TABLE t4 ADD PARTITION (adddate='20120404') LOCATION '/test/user4/user_20120404';
ALTER TABLE t4 ADD PARTITION (adddate='20120405') LOCATION '/test/user4/user_20120405';
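Adding one partition per day quickly becomes repetitive. A sketch that generates the ADD PARTITION statements for a list of dates (the statements are only printed here; redirect the output to a file and run it with `hive -f`):

```shell
# emit one ADD PARTITION statement per date, following the
# /test/user4/user_<date> directory convention used above
for adddate in 20120404 20120405; do
  printf "ALTER TABLE t4 ADD PARTITION (adddate='%s') LOCATION '/test/user4/user_%s';\n" \
    "$adddate" "$adddate"
done
```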
List partitions: SHOW PARTITIONS t4;