Fixing an HBase startup error: No such file or directory!

Scenario

After finishing the HBase setup, I expected to start it up happily and begin testing. Instead, startup produced a flood of errors: JAVA_HOME, HBASE_HOME, HADOOP_HOME and other paths supposedly could not be found, "No such file or directory!"

[root@hadoop0 bin]# start-hbase.sh 
/opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
/opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
/opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
/opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
starting master, logging to /opt/hbase1.2.6/logs/hbase-root-master-hadoop0.out
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
hadoop0: /opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
hadoop0: +======================================================================+
hadoop0: |                    Error: JAVA_HOME is not set                       |
hadoop0: +----------------------------------------------------------------------+
hadoop0: | Please download the latest Sun JDK from the Sun Java web site        |
hadoop0: |     > http://www.oracle.com/technetwork/java/javase/downloads        |
hadoop0: |                                                                      |
hadoop0: | HBase requires Java 1.7 or later.                                    |
hadoop0: +======================================================================+
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 51: export HADOOP_HOME=/opt/hadoop2.6.0: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 50: export JAVA_HOME=/opt/jdk1.8: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 51: export HADOOP_HOME=/opt/hadoop2.6.0: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 52: export HBASE_HOME=/opt/hbase1.2.6: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 53: export HBASE_CLASSPATH=/opt/hadoop2.6.0/etc/hadoop: No such file or directory
hadoop2: /opt/hbase1.2.6/conf/hbase-env.sh: line 55: $'export\302\240HBASE_MANAGES_ZK=false': command not found
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 54: export HBASE_PID_DIR=/opt/hbase1.2.6/pids: No such file or directory
hadoop1: /opt/hbase1.2.6/conf/hbase-env.sh: line 55: $'export\302\240HBASE_MANAGES_ZK=false': command not found
hadoop2: +======================================================================+
hadoop2: |                    Error: JAVA_HOME is not set                       |
hadoop2: +----------------------------------------------------------------------+
hadoop2: | Please download the latest Sun JDK from the Sun Java web site        |
hadoop2: |     > http://www.oracle.com/technetwork/java/javase/downloads        |
hadoop2: |                                                                      |
hadoop2: | HBase requires Java 1.7 or later.                                    |
hadoop2: +======================================================================+
hadoop1: +======================================================================+
hadoop1: |                    Error: JAVA_HOME is not set                       |
hadoop1: +----------------------------------------------------------------------+
hadoop1: | Please download the latest Sun JDK from the Sun Java web site        |
hadoop1: |     > http://www.oracle.com/technetwork/java/javase/downloads        |
hadoop1: |                                                                      |
hadoop1: | HBase requires Java 1.7 or later.                                    |
hadoop1: +======================================================================+
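The telltale line in the log above is the one quoted as $'export\302\240HBASE_MANAGES_ZK=false': \302\240 is the octal escape for the bytes 0xC2 0xA0, the UTF-8 encoding of a non-breaking space (U+00A0) that was pasted between export and the variable name. Bash then parses the whole string as a single command name, which is why it reports "command not found" and "No such file or directory" for lines that look perfectly normal in an editor. A minimal reproduction (DEMO_VAR is just an illustrative name):

```shell
# Write a one-line script with a non-breaking space (bytes 0xC2 0xA0)
# between "export" and the variable name, then run it.
tmp=$(mktemp)
printf 'export\xc2\xa0DEMO_VAR=1\n' > "$tmp"
bash "$tmp"   # fails: bash sees one command named $'export\302\240DEMO_VAR=1'
rm -f "$tmp"
```

The character is invisible in most editors, which is what makes this error so confusing.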

Solution

# Open the hbase-env.sh file

 # cd /opt/hbase1.2.6/conf
 # vim hbase-env.sh

export HBASE_MANAGES_ZK=false
export JAVA_HOME="/opt/jdk1.8"
export HADOOP_HOME="/opt/hadoop2.6.0"
export HBASE_HOME="/opt/hbase1.2.6"
export HBASE_CLASSPATH="/opt/hadoop2.6.0/etc/hadoop"
export HBASE_PID_DIR="/opt/hbase1.2.6/pids"

# Retype the whole block above by hand. Do not copy-paste it from a web page: pasted text can carry invisible full-width or non-breaking space characters, so the shell never sets the environment variables and reports them as missing.
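Instead of retyping blindly, you can first confirm whether the file really contains stray non-ASCII bytes. Assuming GNU grep (for the -P flag), something like this prints every offending line with its line number; the demo below uses a throwaway file with one bad line:

```shell
# Flag any line containing a byte outside plain ASCII (e.g. a pasted
# non-breaking space). Against the real file you would run:
#   grep -nP '[^\x00-\x7F]' /opt/hbase1.2.6/conf/hbase-env.sh
tmp=$(mktemp)
printf 'export JAVA_HOME=/opt/jdk1.8\nexport\xc2\xa0HBASE_MANAGES_ZK=false\n' > "$tmp"
grep -nP '[^\x00-\x7F]' "$tmp"   # prints only line 2, the one with the hidden byte
rm -f "$tmp"
```

A clean file produces no output; any line it does print should be deleted and retyped by hand.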

After the fix, start HBase directly on the master node:

 # start-hbase.sh    

Note: you only need to start it on the master node; the slave nodes will bring up their HBase services automatically.


