Viewing HDFS file attributes from a Java program: accessing a file in HDFS with Java

I am trying to access a file in HDFS using the Java API, but every time I get File Not Found. The code I am using to access it is:

Configuration conf = new Configuration();
conf.addResource(FileUtilConstants.ENV_HADOOP_HOME + FileUtilConstants.REL_PATH_CORE_SITE);
conf.addResource(FileUtilConstants.ENV_HADOOP_HOME + FileUtilConstants.REL_PATH_HDFS_SITE);

try {
    FileSystem fs = FileSystem.get(conf);
    Path hdfsfilePath = new Path(hdfsPath);
    logger.info("Filesystem URI : " + fs.getUri());
    logger.info("Filesystem Home Directory : " + fs.getHomeDirectory());
    logger.info("Filesystem Working Directory : " + fs.getWorkingDirectory());
    logger.info("HDFS File Path : " + hdfsfilePath);
    if (!fs.exists(hdfsfilePath)) {
        logger.error("File does not exists : " + hdfsPath);
    }
} catch (IOException e) {
    logger.error("Error accessing file : " + hdfsPath, e);
}

And here is the command line output from the code.

[root@koversevms ~]# java -jar /tmp/thetus-incendiary-koverse-extension-fileutils-1.0-SNAPSHOT.jar
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem URI : file:///
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem Home Directory : file:/root
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem Working Directory : file:/root
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: HDFS File Path : /usr/hadoop/sample/sample.txt
13/07/10 02:47:18 ERROR fileutils.HadoopFileChecksumUtils: File does not exists : /usr/hadoop/sample/sample.txt

I am new to Hadoop, so I don't know what is going wrong.

Thanks,

Nayan

Solution

Here is a code fragment originally posted as an answer to a different question. It should solve your problem too, even though the intent of that question was different. The main point is that your program never reaches HDFS: the filesystem URI in your output resolves to the local scheme (file:///), which means the default configuration is being used. Please check the fs.defaultFS variable in your configuration.
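A note on the likely root cause (an inference, since your FileUtilConstants values are not shown): Configuration.addResource(String) interprets the string as a classpath resource name, not an absolute file path, so the concatenated paths in your code are silently ignored and the default file:/// scheme stays in effect. Loading the site files explicitly as Path objects avoids that. A minimal sketch, assuming the common /etc/hadoop/conf locations (on older Hadoop 1.x releases the key is fs.default.name rather than fs.defaultFS):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ConfCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // addResource(Path) reads from the local filesystem;
        // addResource(String) would search the classpath instead.
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        // If the site files were picked up, this prints an hdfs:// URI,
        // not the file:/// default seen in the output above.
        System.out.println("fs.defaultFS : " + conf.get("fs.defaultFS"));
        System.out.println("Filesystem URI : " + FileSystem.get(conf).getUri());
    }
}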

package org.myorg;

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsTest {
    public static void main(String[] args) {
        try {
            // Act as the "hbase" user when talking to the cluster.
            UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hbase");
            ugi.doAs(new PrivilegedExceptionAction<Void>() {
                public Void run() throws Exception {
                    Configuration conf = new Configuration();
                    // Point the client at the NameNode explicitly.
                    conf.set("fs.defaultFS", "hdfs://1.2.3.4:8020/user/hbase");
                    conf.set("hadoop.job.ugi", "hbase");

                    FileSystem fs = FileSystem.get(conf);
                    fs.createNewFile(new Path("/user/hbase/test"));

                    // List the directory to confirm the file was created.
                    FileStatus[] status = fs.listStatus(new Path("/user/hbase"));
                    for (int i = 0; i < status.length; i++) {
                        System.out.println(status[i].getPath());
                    }
                    return null;
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
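As a usage note, the NameNode address (1.2.3.4:8020) and the hbase user in this fragment are examples; substitute the values for your cluster. Also, when a standalone jar is launched with plain java, as in your output above, the Hadoop configuration directory is not on the classpath, so Configuration falls back to its built-in defaults; running the class through hadoop jar, or adding $HADOOP_CONF_DIR to the classpath, lets it pick up core-site.xml and hdfs-site.xml without hard-coding fs.defaultFS.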
