Error while uploading data: KeyValue size too large

The exception is as follows:

Exception in thread "main" java.lang.IllegalArgumentException: KeyValue size too large at 
org.apache.hadoop.hbase.client.HTable.validatePut(HTable.java:1545) at 
org.apache.hadoop.hbase.client.BufferedMutatorImpl.validatePut(BufferedMutatorImpl.java:175) at 
org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:146) at 
mil.nga.giat.geowave.datastore.hbase.operations.HBaseWriter.writeMutations(HBaseWriter.java:94) at 
mil.nga.giat.geowave.datastore.hbase.operations.HBaseWriter.write(HBaseWriter.java:88) at 
mil.nga.giat.geowave.datastore.hbase.operations.HBaseWriter.write(HBaseWriter.java:81) at 
mil.nga.giat.geowave.core.store.base.BaseIndexWriter.write(BaseIndexWriter.java:99) at 
mil.nga.giat.geowave.core.store.base.BaseIndexWriter.write(BaseIndexWriter.java:72) at 
mil.nga.giat.geowave.core.store.index.writer.IndexCompositeWriter.write(IndexCompositeWriter.java:49) at 
com.cwgis.importFeatureHBase.importData(importFeatureHBase.java:120) at com.cwgis.App.import_db(App.java:115) at 
com.cwgis.App.main(App.java:51)

The cause is the hbase.client.keyvalue.maxsize setting.
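For context, the HBase client validates every Put before it is buffered: if any single cell exceeds hbase.client.keyvalue.maxsize (10 MB by default in HBase 1.x), HTable.validatePut throws exactly this IllegalArgumentException and nothing is sent to the server. A minimal sketch that reproduces the error against the default limit (the table name, column family, and qualifier below are placeholders, not from the original code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class KeyValueTooLargeDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("demo_table"))) {   // placeholder table
            byte[] bigValue = new byte[20 * 1024 * 1024];                      // a single 20 MB cell value
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), bigValue);  // placeholder family/qualifier
            table.put(put);  // rejected client-side: "KeyValue size too large" under the 10 MB default
        }
    }
}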

My solution:
On the cluster, add the following settings to hbase-site.xml:

<!-- raise the maximum KeyValue size to 500 MB -->
<property>
   <name>hbase.client.keyvalue.maxsize</name>
   <value>524288000</value>
</property>
<property>
  <name>hbase.server.keyvalue.maxsize</name>
  <value>524288000</value>    
</property>

Restart the HBase cluster.
On the client side, edit the hbase-default.xml packaged inside hbase-shaded-client-1.4.6.jar and hbase-shaded-server-1.4.6.jar in the application's lib directory, setting the same parameters:

<property>
   <name>hbase.client.keyvalue.maxsize</name>
   <value>524288000</value>
</property>
<property>
  <name>hbase.server.keyvalue.maxsize</name>
  <value>524288000</value>    
</property>
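Editing hbase-default.xml inside the shaded jars works, but note that HBaseConfiguration loads hbase-default.xml first and then lets any hbase-site.xml found on the client classpath override it, so shipping the two properties above in a client-side hbase-site.xml avoids repackaging the jars. If the writing code builds its own Configuration, the client limit can also be raised programmatically before the connection is created; whether this hook is reachable depends on how the GeoWave HBase data store is configured, so treat the following as a plain-HBase sketch rather than a GeoWave-specific fix:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class LargeKeyValueClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Raise the client-side cell size check to 500 MB; the server-side limit must be at least as large.
        conf.set("hbase.client.keyvalue.maxsize", "524288000");
        try (Connection conn = ConnectionFactory.createConnection(conf)) {
            // ... obtain a Table or BufferedMutator from conn and write the large cells as before ...
        }
    }
}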

---- the end ----
