Spring Boot integration with Hadoop

Listing HDFS directories

HadoopTest.java

import org.apache.hadoop.fs.FileStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.hadoop.fs.FsShell;

@SpringBootApplication
public class HadoopTest implements CommandLineRunner {

	@Autowired
	private FsShell shell;

	// Recursively list everything under the HDFS root directory
	@Override
	public void run(String... args) {
		for (FileStatus s : shell.lsr("/")) {
			System.out.println("> " + s.getPath());
		}
	}

	public static void main(String[] args) {
		SpringApplication.run(HadoopTest.class, args);
	}
}
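
Besides lsr, the injected FsShell bean mirrors most hdfs dfs commands (mkdir, copyFromLocal, rm -r, and so on). Below is a minimal sketch of a few more operations; the local file data.txt, the /tmp/demo directory, and the FsShellOps class name are placeholders invented for illustration.

import org.apache.hadoop.fs.FileStatus;
import org.springframework.data.hadoop.fs.FsShell;

public class FsShellOps {

	// Sketch only: create a directory, upload a local file, list it, then clean up.
	// Adjust the paths for your cluster and local machine.
	public static void uploadAndList(FsShell shell) {
		shell.mkdir("/tmp/demo");                      // hdfs dfs -mkdir /tmp/demo
		shell.copyFromLocal("data.txt", "/tmp/demo");  // hdfs dfs -copyFromLocal data.txt /tmp/demo
		for (FileStatus s : shell.ls("/tmp/demo")) {   // hdfs dfs -ls /tmp/demo
			System.out.println("> " + s.getPath());
		}
		shell.rmr("/tmp/demo");                        // hdfs dfs -rm -r /tmp/demo
	}
}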

application.yml

spring:
  main:
    show_banner: false
  hadoop:
    fsUri: hdfs://172.16.100.91:8020
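
The spring.hadoop.fsUri value ends up in the Hadoop Configuration object that spring-data-hadoop-boot auto-configures and that backs the FsShell bean above. If you prefer the raw Hadoop FileSystem API, a minimal sketch (assuming that Configuration bean is available for injection; the RawHdfsAccess class name is made up for illustration) could look like this:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class RawHdfsAccess {

	// Auto-configured Hadoop Configuration; fs.defaultFS is taken from spring.hadoop.fsUri.
	@Autowired
	private Configuration configuration;

	public void listRoot() throws Exception {
		System.out.println("fs.defaultFS = " + configuration.get("fs.defaultFS"));
		FileSystem fs = FileSystem.get(configuration);      // plain Hadoop FileSystem API
		for (FileStatus s : fs.listStatus(new Path("/"))) { // non-recursive listing of the root directory
			System.out.println("> " + s.getPath());
		}
	}
}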

pom.xml

<dependency>
	<groupId>org.springframework.data</groupId>
	<artifactId>spring-data-hadoop</artifactId>
	<version>2.2.0.RELEASE</version>
	<exclusions>
		<exclusion>
			<groupId>org.springframework</groupId>
			<artifactId>spring-context-support</artifactId>
		</exclusion>
		<exclusion>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-log4j12</artifactId>
		</exclusion>
		<exclusion>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-to-slf4j</artifactId>
		</exclusion>
	</exclusions>
</dependency>
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-client</artifactId>
	<version>2.7.0</version>
	<exclusions>
		<exclusion>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-to-slf4j</artifactId>
		</exclusion>
	</exclusions>
</dependency>
<!-- ${hadoop.version} must be defined in the <properties> section of the POM -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<version>${hadoop.version}</version>
	<scope>compile</scope>
	<exclusions>
		<exclusion>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-to-slf4j</artifactId>
		</exclusion>
	</exclusions>
</dependency>
<!-- spring-data-hadoop-boot supplies the auto-configuration that binds the spring.hadoop.* properties -->
<dependency>
	<groupId>org.springframework.data</groupId>
	<artifactId>spring-data-hadoop-boot</artifactId>
	<version>2.2.0.RELEASE</version>
</dependency>
<!-- Second hadoop-client declaration (CDH build); keep only one hadoop-client version, the one matching your cluster -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-client</artifactId>
	<version>2.6.0-cdh5.7.0</version>
</dependency>

A Hadoop runtime environment is required when running on Windows.

Add the following environment variables:

Variable name: HADOOP_USER_NAME
Value: root

Variable name: HADOOP_HOME
Value: D:\lwenhao\apache-hadoop-3.1.0-winutils-master

HADOOP_HOME also needs to be added to PATH.

Download: apache-hadoop-3.1.0-winutils-master (extraction code: suik)
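
If setting machine-wide environment variables is not convenient, the Hadoop client also falls back to JVM system properties: hadoop.home.dir is checked when HADOOP_HOME is absent, and HADOOP_USER_NAME can likewise be supplied as a system property. A hedged sketch, replacing the main method of HadoopTest.java above (the winutils path is the same placeholder as in the environment-variable example):

	public static void main(String[] args) {
		// Set before the Spring context (and any HDFS access) starts.
		System.setProperty("hadoop.home.dir", "D:\\lwenhao\\apache-hadoop-3.1.0-winutils-master");
		System.setProperty("HADOOP_USER_NAME", "root");
		SpringApplication.run(HadoopTest.class, args);
	}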

Reposted from: https://my.oschina.net/lwenhao/blog/3025710
