Calling the Hadoop API from Eclipse on a local Windows machine to operate a Hadoop cluster

1. Download and install Eclipse.
2. Download Hadoop 2.8.5 to the local Windows machine.
3. Create a new Java project in Eclipse.
4. Add the Hadoop jar files to the Java project:
right-click the project -> Build Path -> Add Libraries -> User Library

The required jars all live under the share directory of the hadoop-2.8.5 folder. Select:
share/hadoop/common/hadoop-common-2.8.5.jar
share/hadoop/common/lib/*
share/hadoop/hdfs/hadoop-hdfs-2.8.5.jar
share/hadoop/hdfs/lib/*
With these added, the Java project's Hadoop dependencies are in place.
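As a quick sanity check (a minimal sketch, not part of the original setup), you can try loading a core Hadoop class by name: if the jars above are on the build path, the lookup succeeds. The class name `ClasspathCheck` is hypothetical.

```java
public class ClasspathCheck {
    // Returns true when the named class can be loaded from the current classpath.
    static boolean classPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // org.apache.hadoop.conf.Configuration is packaged in hadoop-common-2.8.5.jar
        System.out.println(classPresent("org.apache.hadoop.conf.Configuration")
                ? "Hadoop jars found on the classpath"
                : "Hadoop jars missing - re-check the User Library");
    }
}
```

If this prints "missing", revisit the User Library and make sure all four entries above were added.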

5. Using the Hadoop Java API
Set cluster parameters from the client, for example the block replication factor and the block size, then upload a file.

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClientDemo01 {
	public static void main(String[] args) throws IOException, InterruptedException, URISyntaxException {
		//1. Load the client-side configuration
		Configuration conf = new Configuration();
		
		//2. Override settings: two replicas per block
		conf.set("dfs.replication", "2");
		
		//3. Set the block size
		conf.set("dfs.blocksize", "64m");
		
		//4. Build the client, connecting to the NameNode as user "root"
		FileSystem fs = FileSystem.get(new URI("hdfs://192.168.252.121:9000/"), conf, "root");
		
		//5. Upload a local file to HDFS
		fs.copyFromLocalFile(new Path("e:/words.txt"), new Path("/words.txt"));
		
		//6. Release resources
		fs.close();
	}
}

6. Using the Hadoop Java API to download a file from HDFS

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClientDemo02 {

	public static void main(String[] args) throws IOException, InterruptedException, URISyntaxException {
		//1. Load the configuration
		Configuration conf = new Configuration();
		
		//2. Set the replication factor
		conf.set("dfs.replication", "2");
		
		//3. Set the block size
		conf.set("dfs.blocksize", "64m");
		
		//4. Build the client, connecting to the NameNode as user "root"
		FileSystem fs = FileSystem.get(new URI("hdfs://192.168.252.121:9000/"), conf, "root");
		
		//5. Download the HDFS file to the local Windows machine
		fs.copyToLocalFile(new Path("/a.txt"), new Path("e:/aa.txt"));
		
		//6. Release resources
		fs.close();
	}

}

Note: running the download as-is fails with the following error:

Exception in thread "main" java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:716)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:250)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:267)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:771)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.&lt;init&gt;(RawLocalFileSystem.java:237)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.&lt;init&gt;(RawLocalFileSystem.java:221)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:319)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:339)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.&lt;init&gt;(ChecksumFileSystem.java:399)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:462)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:441)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:929)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:910)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:807)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:368)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:341)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2067)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2036)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2012)
at com.tony.hdfs01.HdfsClientDemo02.main(HdfsClientDemo02.java:27)

When downloading, the Hadoop client looks for its native (C) libraries in a local Hadoop installation on Windows; if it cannot find them, it throws the error above. The cause is that the Hadoop environment variables are not configured on Windows.
Create a new environment variable:
HADOOP_HOME=E:\bigdata\hadoop-2.8.5
and append %HADOOP_HOME%\bin to the existing PATH variable.

After configuring the environment variables, exit and restart Eclipse for them to take effect.

Running it again, it still fails, this time with:

Exception in thread "main" java.lang.RuntimeException: java.io.FileNotFoundException: Could not locate Hadoop executable: E:\bigdata\hadoop-2.8.5\bin\winutils.exe -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:716)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:250)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:267)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:771)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.&lt;init&gt;(RawLocalFileSystem.java:237)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.&lt;init&gt;(RawLocalFileSystem.java:221)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:319)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:339)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.&lt;init&gt;(ChecksumFileSystem.java:399)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:462)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:441)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:929)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:910)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:807)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:368)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:341)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2067)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2036)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2012)
at com.tony.hdfs01.HdfsClientDemo02.main(HdfsClientDemo02.java:27)

This error is caused by a missing winutils.exe. Copy the winutils.exe matching your Hadoop version into Hadoop's bin directory and the problem is solved. winutils is a set of Hadoop helper binaries pre-compiled for Windows.
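After copying winutils.exe into place, a small self-check can confirm that the JVM actually sees HADOOP_HOME (it is read at JVM startup, hence the Eclipse restart) and that the executable sits where the error message says Hadoop probes. This is a minimal sketch; the class name `WinutilsCheck` is hypothetical, and the %HADOOP_HOME%\bin\winutils.exe layout is taken from the error above.

```java
import java.io.File;

public class WinutilsCheck {
    // Builds the path Hadoop probes on Windows: %HADOOP_HOME%\bin\winutils.exe
    static String winutilsPath(String hadoopHome) {
        return hadoopHome + "\\bin\\winutils.exe";
    }

    public static void main(String[] args) {
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            // Environment variables are captured at JVM startup - restart Eclipse
            System.out.println("HADOOP_HOME is not visible to this JVM");
        } else if (new File(winutilsPath(home)).exists()) {
            System.out.println("winutils.exe found at " + winutilsPath(home));
        } else {
            System.out.println("winutils.exe missing at " + winutilsPath(home));
        }
    }
}
```

If this reports the variable as not visible even though it is set, Eclipse (or the launched JVM) was started before the variable existed.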
