Installing Snappy for HBase: HBase's built-in Snappy compression test fails

While running HBase's Snappy compression test, a permission error occurred. The problem was solved by adding a `core-site.xml` to HBase's conf directory or by setting the `HADOOP_CONF_DIR` environment variable. The test command `hbase org.apache.hadoop.hbase.util.CompressionTest /ab.txt snappy` was changed to use an HDFS path, or the relevant configuration was added.

Today, while verifying that Snappy was usable, I ran into a problem: HBase's built-in compression test, hbase org.apache.hadoop.hbase.util.CompressionTest /ab.txt snappy, failed with the following output:

hbase org.apache.hadoop.hbase.util.CompressionTest /ab.txt snappy
14/04/25 14:50:55 INFO Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data1/cdh5/app/hbase/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data1/cdh5/app/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
14/04/25 14:50:57 INFO util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
14/04/25 14:50:57 INFO util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
14/04/25 14:50:57 DEBUG util.FSUtils: Creating file=/ab.txt with permission=rwxrwxrwx
14/04/25 14:50:57 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 1
14/04/25 14:50:58 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 2
14/04/25 14:51:00 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 3
14/04/25 14:51:03 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 4
14/04/25 14:51:07 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 5
14/04/25 14:51:12 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 6
14/04/25 14:51:18 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 7
14/04/25 14:51:25 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 8
14/04/25 14:51:33 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 9
14/04/25 14:51:42 INFO hbase.HBaseFileSystem: Create Path with Perms, sleeping 1000 times 10
14/04/25 14:51:52 WARN hbase.HBaseFileSystem: Create Path with Perms, retries exhausted
Exception in thread "main" java.io.IOException: Exception in createPathWithPermsOnFileSystem
	at org.apache.hadoop.hbase.HBaseFileSystem.createPathWithPermsOnFileSystem(HBaseFileSystem.java:218)
	at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:157)
	at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:133)
	at org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.createOutputStream(AbstractHFileWriter.java:271)
	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:398)
	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
Caused by: java.io.FileNotFoundException: /ab.txt (Permission denied)
	at java.io.FileOutputStream.open(Native Method)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:212)
	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:206)
	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:202)
	at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:265)
	at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:252)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:384)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:443)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
	at org.apache.hadoop.hbase.HBaseFileSystem.createPathWithPermsOnFileSystem(HBaseFileSystem.java:210)
	... 6 more

However, creating an HBase table with Snappy compression succeeded, and data could be loaded into it normally, so Snappy itself was working. I noticed RawLocalFileSystem in the stack trace, meaning the test was writing to the local filesystem rather than HDFS, so I changed the command to:

hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://cdh5cluster/ab.txt snappy

which returned SUCCESS.

Looking in HBase's conf directory, there was no core-site.xml. After adding it, hbase org.apache.hadoop.hbase.util.CompressionTest /ab.txt snappy returned SUCCESS. So the root cause really was that HBase had not picked up the cluster's configuration: without it, the bare path /ab.txt resolved to the local filesystem instead of HDFS.
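For reference, a minimal sketch of such a core-site.xml, containing only the default filesystem. The value hdfs://cdh5cluster is an assumption taken from the working command above; substitute your own nameservice or namenode address:

```xml
<?xml version="1.0"?>
<!-- Minimal sketch: only fs.defaultFS is set, so bare paths like
     /ab.txt resolve to HDFS instead of the local filesystem.
     "cdh5cluster" is the nameservice from the command above. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://cdh5cluster</value>
  </property>
</configuration>
```

If cdh5cluster is an HA nameservice, the matching hdfs-site.xml (which defines the nameservice and its namenodes) also needs to be visible to HBase.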

PS: There is another fix: add the HADOOP_CONF_DIR environment variable in hbase-env.sh, i.e. export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop. That way there is no need to copy core-site.xml and hdfs-site.xml into HBase's conf directory; HBase picks up the cluster configuration directly. Very convenient. Another small tip learned. Thanks, Kai.
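A sketch of the hbase-env.sh change; the HADOOP_HOME path is an assumption based on the /data1/cdh5/app/hadoop path visible in the log above, so adjust it to your own install:

```shell
# hbase-env.sh -- point HBase at Hadoop's client configuration
# so fs.defaultFS and the HDFS settings are picked up directly,
# without copying core-site.xml/hdfs-site.xml into HBase's conf dir.
export HADOOP_HOME=/data1/cdh5/app/hadoop   # assumed path, from the log above
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
```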
