I recently hit the following errors while bulk loading data into HBase with HFileOutputFormat2. I'm writing them down here for anyone who runs into the same problems.
Exception 1: java.io.IOException: Mkdirs failed to create /user/adorechen/hbase-staging
Exception in thread "main" java.io.IOException: Mkdirs failed to create /user/adorechen/hbase-staging (exists=false, cwd=file:/Users/adorechen/IdeaProjects/hbase-test)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:455)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1135)
at org.apache.hadoop.io.SequenceFile$RecordCompressWriter.<init>(SequenceFile.java:1441)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:275)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:297)
at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.writePartitions(HFileOutputFormat2.java:407)
at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:675)
at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:518)
at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:477)
at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:458)
at com.coupang.HFileOutputFormat2Use$.main(HFileOutputFormat2Use.scala:42)
at com.coupang.HFileOutputFormat2Use.main(HFileOutputFormat2Use.scala)
On its face, the exception says a directory could not be created, so I tried creating it myself:
hdfs dfs -mkdir -p /user/adorechen/hbase-staging
The command succeeded without any problem, yet rerunning the job produced the same error. So where was the real problem?
The error message itself contains a clue:
cwd=file:/Users/adorechen/IdeaProjects/hbase-test
cwd stands for current working directory. Surprisingly, it pointed at the local file system, so I suspected the HDFS configuration had not been loaded. Printing the conf confirmed it:
fs.defaultFS=file:///
My HDFS runs in pseudo-distributed mode with defaultFS=hdfs://localhost:9000, so this verified that the Hadoop configuration was indeed not being loaded.
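Hadoop's Configuration loads core-site.xml and hdfs-site.xml as classpath resources, so a quick way to check whether they would be picked up at all is to ask the class loader directly. Below is a minimal stdlib-only sketch (the class name is my own, not from the project above):

```java
// Checks whether Hadoop's site configuration files are visible on the JVM
// classpath. Configuration reads core-site.xml / hdfs-site.xml as classpath
// resources; if getResource returns null, Hadoop silently falls back to the
// built-in default fs.defaultFS=file:///.
public class CheckConfOnClasspath {
    public static String locate(String resource) {
        java.net.URL url =
                CheckConfOnClasspath.class.getClassLoader().getResource(resource);
        return url == null
                ? resource + " NOT on classpath"
                : "found " + resource + " at " + url;
    }

    public static void main(String[] args) {
        System.out.println(locate("core-site.xml"));
        System.out.println(locate("hdfs-site.xml"));
    }
}
```

If this prints "NOT on classpath" for core-site.xml, the job will see fs.defaultFS=file:/// no matter how the cluster itself is configured.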
Solution: copy the configuration files core-site.xml and hdfs-site.xml from the $HADOOP_HOME/etc/hadoop directory into $PROJECT_DIR/src/main/resources. Problem solved.
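For reference, a minimal core-site.xml for a pseudo-distributed setup like mine (assuming the NameNode address hdfs://localhost:9000 mentioned above) looks like:

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

Alternatively, one could set this programmatically with conf.set("fs.defaultFS", "hdfs://localhost:9000") on the job's Configuration before calling configureIncrementalLoad, though copying the real site files keeps the job consistent with the cluster's other settings.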
Exception 2: while importing the HFile data, the following exception was thrown:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/filter/Filter
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.<init>(P