The plan: build a kafkaconsumer.jar command-line tool, packaged with the maven-shade-plugin; the only customization is the main class. The configuration:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.7.0</version>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <compilerVersion>1.8</compilerVersion>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.1.0</version>
      <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <minimizeJar>true</minimizeJar>
      </configuration>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.sanwishe.ConsumerBootStrap</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
First I tested it directly in IDEA: everything worked fine. Then I packaged it, deployed it, and tested it online:
kafkaconsumer java -jar target/kafka-consumer-1.0-SNAPSHOT.jar -z localhost:9092 -t default
kafka bootstrap server:127.0.0.1:9092
topic to be consumed: default
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.IntegerDeserializer for configuration key.deserializer: Class org.apache.kafka.common.serialization.IntegerDeserializer could not be found.
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:715)
at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:460)
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:453)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
at org.apache.kafka.clients.consumer.ConsumerConfig.<init>(ConsumerConfig.java:481)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:635)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:617)
at com.sanwishe.MyConsumer.getConsumer(MyConsumer.java:51)
at com.sanwishe.MyConsumer.run(MyConsumer.java:22)
at com.sanwishe.ConsumerBootStrap.main(ConsumerBootStrap.java:52)
what???
I checked IntegerDeserializer — it is indeed shipped in the kafka-clients jar. Since the class could not be found at runtime, I opened the shaded jar under target to look, and sure enough the class was not there. So presumably the packaging step had failed to include the jar containing the deserializer classes. Back to the Maven packaging log:
[INFO] --- maven-shade-plugin:3.1.0:shade (default) @ kafka-consumer ---
[INFO] Including org.apache.kafka:kafka-clients:jar:1.0.0 in the shaded jar.
[INFO] Including org.lz4:lz4-java:jar:1.4 in the shaded jar.
[INFO] Including org.xerial.snappy:snappy-java:jar:1.1.4 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.25 in the shaded jar.
[INFO] Including commons-cli:commons-cli:jar:1.4 in the shaded jar.
[INFO] Minimizing jar com.sanwishe:kafka-consumer:jar:1.0-SNAPSHOT
[INFO] Minimized 1219 -> 847 (69%)
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /Users/jmz/IdeaProjects/kafkaconsumer/target/kafka-consumer-1.0-SNAPSHOT.jar with /Users/jmz/IdeaProjects/kafkaconsumer/target/kafka-consumer-1.0-SNAPSHOT-shaded.jar
Seeing this, the cause is basically clear. Look back at <minimizeJar>true</minimizeJar> in the pom, then check the maven-shade-plugin documentation: minimizeJar analyzes class dependencies and strips any class that the project's code never references directly. Kafka, however, loads its deserializers reflectively — the config value is just a class name string — so the dependency analysis never sees them and they get minimized away.
OK, removing that setting, rebuilding, and repackaging: test OK.
The problem itself was simple, so let's use it as an excuse to learn a bit more about maven-shade-plugin.
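Why does minimization miss these classes? A minimal, self-contained illustration (using java.util.ArrayList as a stand-in for a Kafka deserializer class name — this is not the Kafka code itself): when the only reference to a class is a String resolved through reflection, bytecode-level dependency analysis, which minimizeJar relies on, has nothing to see.

```java
// Stand-alone demo: a class referenced only by its String name.
// Kafka resolves key.deserializer / value.deserializer the same way,
// which is why minimizeJar's static analysis strips them from the jar.
public class ReflectiveLoadDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for e.g. "org.apache.kafka.common.serialization.IntegerDeserializer"
        String className = "java.util.ArrayList";
        // The class is resolved and instantiated only at runtime:
        Object instance = Class.forName(className).getDeclaredConstructor().newInstance();
        System.out.println(instance.getClass().getName()); // prints java.util.ArrayList
    }
}
```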
- createDependencyReducedPom: controls whether a reduced pom is created for the shaded artifact. When true, the dependencies that were shaded into the jar are removed from your pom and moved into a temporary file named dependency-reduced-pom.xml. It defaults to true; you usually want to set it to false.
- createSourcesJar: whether to also build a sources jar; usually not needed.
- filters: specify, via a series of include/exclude rules, which contents are added to the jar. Logically, include takes effect before exclude, so you can use include to collect content and exclude to trim it down. If no filters are configured, everything is included by default. An artifact can match multiple filters; the final content is the intersection of those filters.
When a project has several third-party dependencies, their META-INF directories often contain identically named files (e.g. MANIFEST entries), and maven-shade-plugin by default merges them by appending rather than overwriting. Because some of those jars are signed, the shaded jar's META-INF directory picks up stray *.SF (and *.DSA/*.RSA) files, and running the jar may then throw java.lang.SecurityException: Invalid signature file digest for Manifest main attributes. The following filters configuration avoids the problem:
<configuration>
  <filters>
    <filter>
      <artifact>*:*</artifact>
      <excludes>
        <exclude>META-INF/*.SF</exclude>
        <exclude>META-INF/*.DSA</exclude>
        <exclude>META-INF/*.RSA</exclude>
      </excludes>
    </filter>
  </filters>
</configuration>
- Specifying the main class:
<executions>
  <execution>
    <phase>package</phase>
    <goals>
      <goal>shade</goal>
    </goals>
    <configuration>
      <transformers>
        <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
          <mainClass>com.sanwishe.ConsumerBootStrap</mainClass>
        </transformer>
      </transformers>
    </configuration>
  </execution>
</executions>
- finalName: sets the name of the shaded artifact. For example:
<executions>
  <execution>
    <phase>package</phase>
    <goals>
      <goal>shade</goal>
    </goals>
    <configuration>
      <finalName>myKafkaConsumer</finalName>
      <transformers>
        <!--transformer>...</transformer-->
      </transformers>
    </configuration>
  </execution>
</executions>
The files produced under target then include the executable jar, now named myKafkaConsumer.jar.
More to be continued.