Integrating Kafka with Spark Streaming (Java) and Writing Data to HBase in Real Time

This article shows how to integrate Kafka with Spark Streaming in Java and write the processed data to HBase in real time. First, configure the required dependencies in pom.xml; then build the project with mvn package, which produces spark-1-jar-with-dependencies.jar; finally, upload that jar to the Spark cluster and run it. A source code download link is provided for reference.

Maven pom.xml configuration

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>spark</groupId>
	<artifactId>spark</artifactId>
	<version>1</version>
	<packaging>jar</packaging>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
	</properties>
	<dependencies>

		<dependency>
			<groupId>org.apache.spark</groupId>
			<artifactId>spark-streaming_2.10</artifactId>
			<version>1.2.0</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.spark</groupId>
			<artifactId>spark-streaming-kafka_2.10</artifactId>
			<version>1.2.0</version>
		</dependency>
		<dependency>
			<groupId>org.clojure</groupId>
			<artifactId>clojure</artifactId>
			<version>1.6.0</version>
		</dependency>
		<dependency>
			<groupId>com.google.guava</groupId>
			<artifactId>guava</artifactId>
			<version>11.0.2</version>
		</dependency>
		<dependency>
			<groupId>org.apache.hbase</groupId>
			<artifactId>hbase-client</artifactId>
			<version>0.98.4-hadoop2</version>
		</dependency>
		<dependency>
			<groupId>com.google.protobuf</groupId>
			<artifactId>protobuf-java</artifactId>
			<version>2.5.0</version>
		</dependency>
		<dependency>
			<groupId>io.netty</groupId>
			<artifactId>netty</artifactId>
			<version>3.6.6.Final</version>
		</dependency>
		<dependency>
			<groupId>org.apache.hbase</groupId>
			<artifactId>hbase-common</artifactId>
			<version>0.98.4-hadoop2</version>
		</dependency>
		<dependency>
			<groupId>org.apache.hbase</groupId>
			<artifactId>hbase-protocol</artifactId>
			<version>0.98.4-hadoop2</version>
		</dependency>
		<dependency>
			<groupId>org.apache.zookeeper</groupId>
			<artifactId>zookeeper</artifactId>
			<version>3.4.5</version>
		</dependency>
		<dependency>
			<groupId>org.cloudera.htrace</groupId>
			<artifactId>htrace-core</artifactId>
			<version>2.01</version>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<!-- Bind the maven-assembly-plugin to the package phase; this creates
				a jar file, without the provided Spark dependencies, suitable for deployment to a cluster. -->
			<plugin>
				<artifactId>maven-assembly-plugin</artifactId>
				<configuration>
					<descriptorRefs>
						<descriptorRef>jar-with-dependencies</descriptorRef>
					</descriptorRefs>
					<archive>
						<manifest>
							<mainClass />
						</manifest>
					</archive>
				</configuration>
				<executions>
					<execution>
						<id>make-assembly</id>
						<phase>package</phase>
						<goals>
							<goal>single</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>
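
With the dependencies in place, the streaming job itself is plain Java. The sketch below is only a minimal illustration of the pipeline, not the article's downloadable source: it uses the receiver-based KafkaUtils.createStream API that ships with spark-streaming-kafka_2.10 1.2.0 and the HTable/Put client API from hbase-client 0.98.4-hadoop2. The ZooKeeper address, consumer group, Kafka topic, HBase table name, and column family below are placeholder values.

import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

import scala.Tuple2;

public class KafkaToHBase {

    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("KafkaToHBase");
        // 5-second batch interval (placeholder value)
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(5000));

        // Receiver-based Kafka stream; topic name and ZooKeeper address are placeholders
        Map<String, Integer> topics = new HashMap<String, Integer>();
        topics.put("test", 1);
        JavaPairReceiverInputDStream<String, String> messages =
                KafkaUtils.createStream(jssc, "zk1:2181", "spark-group", topics);

        // Keep only the message value from each (key, value) pair
        JavaDStream<String> lines = messages.map(new Function<Tuple2<String, String>, String>() {
            public String call(Tuple2<String, String> tuple) {
                return tuple._2();
            }
        });

        // Write each batch to HBase, opening one HTable connection per partition
        lines.foreachRDD(new Function<JavaRDD<String>, Void>() {
            public Void call(JavaRDD<String> rdd) throws Exception {
                rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
                    public void call(Iterator<String> it) throws Exception {
                        Configuration hconf = HBaseConfiguration.create();
                        hconf.set("hbase.zookeeper.quorum", "zk1"); // placeholder quorum
                        HTable table = new HTable(hconf, "stream_data"); // placeholder table
                        while (it.hasNext()) {
                            String line = it.next();
                            // Timestamp row key is for illustration only; a real job needs a proper key design
                            Put put = new Put(Bytes.toBytes(Long.toString(System.currentTimeMillis())));
                            put.add(Bytes.toBytes("cf"), Bytes.toBytes("line"), Bytes.toBytes(line));
                            table.put(put);
                        }
                        table.close();
                    }
                });
                return null;
            }
        });

        jssc.start();
        jssc.awaitTermination();
    }
}

Packaging with mvn package then produces spark-1-jar-with-dependencies.jar, which is uploaded to the Spark cluster and launched with spark-submit, giving this class (or the real main class from the downloadable source) as the entry point. Writing inside foreachPartition keeps one HTable connection per partition rather than one per record.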