When writing HFiles for bulk load, if a single column family in one region accumulates more than the default limit of 32 HFiles, the load fails with the following error:
ERROR mapreduce.LoadIncrementalHFiles: Trying to load more than 32 hfiles to family d of region with start key
Exception in thread "main" java.io.IOException: Trying to load more than 32 hfiles to one family of one region
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:)
Solution:
Add the following property to hbase-site.xml, set to a value larger than the number of HFiles your job produces per family per region:
hbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily
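A minimal sketch of what that entry could look like in hbase-site.xml; the value 1024 is only an illustrative limit, pick one that covers your job's actual output:

<property>
  <!-- maximum number of HFiles per column family per region that one bulk load may submit; the default is 32 -->
  <name>hbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily</name>
  <value>1024</value>
</property>

Since LoadIncrementalHFiles runs through ToolRunner (visible in the stack trace above), the same property should also be settable per job as a generic option, e.g. -Dhbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily=1024. Note that raising the limit only admits more files; each extra HFile still adds read amplification until compaction, so reducing how many HFiles the job writes per region is the other lever.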
hbase shell startup error
Starting hbase