Out of the box, Hypertable supports HDFS from CDH3 and CDH4.
What if you want to run it against a different HDFS distribution? The steps below use Hadoop 2.2.0 as an example.
1. In the configuration, specify CDH4 as the Hadoop distro (we will reuse its jar slot for Hadoop 2.2.0).
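For reference, this usually means a one-line setting in Hypertable's config file. The exact key name below is an assumption for illustration; check the docs for your Hypertable release.

```
# conf/hypertable.cfg -- tell Hypertable which Hadoop flavor to load
# (key name is illustrative; verify against your release)
HdfsBroker.Hadoop.Distro=cdh4
```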
2. Replace the CDH4 jars that Hypertable uses to talk to HDFS.
The jars under /hypertable/current/lib/java/ that must be swapped out (listed here at their Hadoop 2.2.0 replacement versions) are:
hadoop-auth-2.2.0.jar
hadoop-common-2.2.0.jar
hadoop-hdfs-2.2.0.jar
hadoop-mapreduce-client-core-2.2.0.jar
protobuf-java-2.5.0.jar
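The swap above can be sketched as a small script. HT_LIB and HADOOP_HOME are placeholders for your installation paths; the function is defined but not invoked, so run it deliberately on the Hypertable node.

```shell
# Sketch: swap the bundled CDH4 client jars for the ones shipped
# with the Hadoop 2.2.0 distribution. Paths are assumptions --
# adjust HT_LIB and HADOOP_HOME to match your machines.
HT_LIB=${HT_LIB:-/hypertable/current/lib/java}
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop-2.2.0}

replace_jars() {
  for name in hadoop-auth hadoop-common hadoop-hdfs \
              hadoop-mapreduce-client-core protobuf-java; do
    # drop whatever version Hypertable bundled
    rm -f "$HT_LIB/$name"-*.jar
    # the replacements live in several subdirectories of the
    # Hadoop distribution (share/hadoop/common, .../hdfs, ...)
    find "$HADOOP_HOME/share/hadoop" -name "$name-*.jar" \
      -exec cp {} "$HT_LIB/" \;
  done
}
# replace_jars   # uncomment to actually perform the swap
```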
3. Fix the incompatible code in the DfsBroker's HadoopBroker.java.
    // In Hadoop 2.x the cast target is the public class
    // org.apache.hadoop.hdfs.client.HdfsDataInputStream (add the import),
    // replacing the old DFSClient.DFSDataInputStream:
    /* DFSClient.DFSDataInputStream in =
           (DFSClient.DFSDataInputStream)mFilesystem.open(path); */
    HdfsDataInputStream in =
        (HdfsDataInputStream)mFilesystem.open(path);
4. Recompile and repackage hypertable-0.9.7.13.jar, then put it back under /hypertable/current/lib/java/.
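The rebuild-and-deploy step might look like the sketch below. The post does not say which build tool the Java sources use, so the `mvn` invocation and the HT_SRC path are assumptions; the function is defined but left uncalled.

```shell
# Hypothetical rebuild-and-deploy sequence. HT_SRC and HT_LIB are
# placeholders; the mvn command is an assumption about the build tool.
HT_SRC=${HT_SRC:-/path/to/hypertable-src}
HT_LIB=${HT_LIB:-/hypertable/current/lib/java}

build_and_deploy() {
  cd "$HT_SRC/src/java" || return 1
  mvn -DskipTests package || return 1          # rebuild the jar
  cp target/hypertable-0.9.7.13.jar "$HT_LIB/" # redeploy it
}
# build_and_deploy   # run on a machine with the sources checked out
```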
5. Restart Hypertable, and everything should work.