Operating HDFS from a Web Program on Windows

The plan is to write a Spring program that operates HDFS from Windows. The main code is as follows:

import java.io.IOException;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.servlet.View;
import org.springframework.web.servlet.view.RedirectView;

@Controller
public class HdfsConnect {
    // private final Map<Long, HdfsFrom> Config = new Hashtable<>(); // per-user state, currently unused
    public FileSystem fs;

    // Render the HDFS operations page.
    @RequestMapping(value = "hdfs/Opera", method = RequestMethod.GET)
    public String connectHdfs() {
        return "hdfs/Opera";
    }

    // Render the connection form, backed by an empty HdfsFrom bean.
    @RequestMapping(value = "hdfs/connect", method = RequestMethod.GET)
    public String connectHdfs(Map<String, Object> model) {
        model.put("HdfsFrom", new HdfsFrom());
        return "hdfs/connect";
    }

    // Handle the submitted form: connect to HDFS and create a test directory.
    // The local Hadoop distribution lives at F:\hadoop\hadoop-2.5.2.
    @RequestMapping(value = "hdfs/connect", method = RequestMethod.POST)
    public View connectHdfs(HdfsFrom hdfs) throws IOException {
        System.setProperty("hadoop.home.dir", "F:\\hadoop\\hadoop-2.5.2");
        Configuration conf = new Configuration();
        conf.set("mapred.job.tracker", hdfs.getJobtracker());
        conf.set("fs.default.name", hdfs.getNamenode());
        fs = FileSystem.get(conf);
        fs.mkdirs(new Path("/test"));
        /* fs.copyFromLocalFile(new Path("C:\\Users\\DELL\\Documents\\申请格式.doc"),
                new Path("hdfs://192.168.1.121:9000/")); */
        return new RedirectView("/hdfs/Opera", true, false);
    }
}
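
The controller binds the submitted form to a bean called HdfsFrom that carries the NameNode and JobTracker addresses entered on the page. That class is not shown in this post; a minimal sketch, assuming it has just those two string fields, might look like this:

// Hypothetical sketch of the form bean backing the hdfs/connect page; the real class may carry more fields.
public class HdfsFrom {
    private String namenode;   // e.g. hdfs://192.168.1.121:9000
    private String jobtracker; // e.g. 192.168.1.121:9001 (assumed JobTracker address)

    public String getNamenode() { return namenode; }
    public void setNamenode(String namenode) { this.namenode = namenode; }

    public String getJobtracker() { return jobtracker; }
    public void setJobtracker(String jobtracker) { this.jobtracker = jobtracker; }
}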

Debugging shows that the program fails at fs.mkdirs(). The error message is:

java.net.ConnectException: Call From DELL-PC/169.254.56.162 to Master:9000 failed on connection exception: java.net.ConnectException:

169.254.56.162 is the IPv4 address of the VirtualBox adapter. After uninstalling VirtualBox, the error message becomes:

java.net.ConnectException: Call From DELL-PC/192.168.1.104 to Master:9000 failed on connection exception: java.net.ConnectException:

Now the program sends the request from the machine's own IP to connect to the Hadoop cluster, but the connection still fails. The detailed error message is as follows:

java.net.ConnectException: Call From DELL-PC/192.168.1.102 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
	sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
	java.lang.reflect.Constructor.newInstance(Unknown Source)
	org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
	org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
	org.apache.hadoop.ipc.Client.call(Client.java:1415)
	org.apache.hadoop.ipc.Client.call(Client.java:1364)
	org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
	com.sun.proxy.$Proxy49.mkdirs(Unknown Source)
	sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	java.lang.reflect.Method.invoke(Unknown Source)
	org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
	org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	com.sun.proxy.$Proxy49.mkdirs(Unknown Source)
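
The wiki page referenced in the message suggests first confirming that the NameNode port is reachable at all from the Windows machine. A quick sketch using plain java.net.Socket (the host name Master and port 9000 are taken from the error message above) can rule out network or firewall problems before touching the Hadoop configuration:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        String host = "Master"; // NameNode host as reported in the exception
        int port = 9000;        // NameNode RPC port as reported in the exception
        try (Socket socket = new Socket()) {
            // Fail fast after 3 seconds instead of waiting for the OS-level timeout.
            socket.connect(new InetSocketAddress(host, port), 3000);
            System.out.println(host + ":" + port + " is reachable");
        } catch (IOException e) {
            System.out.println("Cannot reach " + host + ":" + port + " - " + e.getMessage());
        }
    }
}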

The error indicates that Windows cannot connect to the Hadoop cluster. The first suggested fix is to set the following in hdfs-site.xml:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

After rerunning, the error still occurs. A solution was finally found in the blog post at http://blog.csdn.net/u010536377/article/details/48949247: simply change localhost in the fs.defaultFS setting of core-site.xml to the actual IP address, and the web-based HDFS operations then succeed.
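
For reference, the relevant fragment of core-site.xml after that change would look roughly like the following; 192.168.1.121 is only an example taken from the commented-out upload path above, so substitute your own NameNode's address:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://192.168.1.121:9000</value>
</property>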

