HADOOP / HIVE versions
lrwxrwxrwx. 1 hadoop hadoop 12 May 14 09:53 hadoop -> hadoop-2.2.0
drwxr-xr-x. 10 hadoop hadoop 4096 May 14 16:45 hadoop-2.2.0
lrwxrwxrwx. 1 hadoop hadoop 15 May 16 15:38 hive -> hive-0.12.0-bin
drwxrwxr-x. 8 hadoop hadoop 4096 May 16 15:38 hive-0.12.0-bin
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.cn.hive</groupId>
<artifactId>HiveTest</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.2.0</version>
<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>0.12.0</version>
</dependency>
</dependencies>
</project>
Posts online say you only need the hadoop-common and hive-jdbc dependencies, but I tried that and it does not work:
you get a ClassNotFoundException for Hadoop's Writable class.
You also need to add hadoop-core.
Then start hiveserver:
[hadoop@localhost Desktop]$ hive --service hiveserver -p 10002
Starting Hive Thrift Server
14/06/06 15:25:43 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
14/06/06 15:25:43 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
14/06/06 15:25:43 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
14/06/06 15:25:43 INFO Configuration.deprecation: mapred.min.split.size.per.r
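With the server up, a client can connect over JDBC. Below is a minimal sketch; it assumes the old HiveServer (Thrift) from Hive 0.12, whose driver class is org.apache.hadoop.hive.jdbc.HiveDriver and whose URL scheme is jdbc:hive:// (HiveServer2 would instead use org.apache.hive.jdbc.HiveDriver and jdbc:hive2://). The class name HiveJdbcTest and the "show tables" query are illustrative, not from the original post.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTest {

    // Driver for the old HiveServer shipped with Hive 0.12.
    private static final String DRIVER = "org.apache.hadoop.hive.jdbc.HiveDriver";

    // Builds the JDBC URL for a given host, port, and database.
    static String jdbcUrl(String host, int port, String db) {
        return "jdbc:hive://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) throws Exception {
        Class.forName(DRIVER);
        // 10002 matches the port passed to "hive --service hiveserver -p 10002";
        // the old HiveServer does no authentication, so user/password are empty.
        Connection conn = DriverManager.getConnection(
                jdbcUrl("localhost", 10002, "default"), "", "");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("show tables");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}
```

Running main requires the hive-jdbc and hadoop dependencies from the pom above on the classpath, plus a reachable hiveserver on port 10002.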