A maven-shade-plugin pitfall

This post walks through packaging a Kafka consumer Java application with the maven-shade-plugin: the configuration details, a common error, and how to fix it.


I wanted to build a kafkaconsumer.jar command-line tool and picked maven-shade-plugin as the packaging tool, with nothing customized beyond the main-class configuration below:

 <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.7.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <compilerVersion>1.8</compilerVersion>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.1.0</version>
                <configuration>
                    <createDependencyReducedPom>false</createDependencyReducedPom>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                    <minimizeJar>true</minimizeJar>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.sanwishe.ConsumerBootStrap</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>


        </plugins>
    </build>

First I tested it straight from IDEA: everything OK...

Then package, release, and test the deployed jar...

kafkaconsumer java -jar target/kafka-consumer-1.0-SNAPSHOT.jar -z localhost:9092 -t default
kafka bootstrap server:127.0.0.1:9092
topic to be consumed: default
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.IntegerDeserializer for configuration key.deserializer: Class org.apache.kafka.common.serialization.IntegerDeserializer could not be found.
        at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:715)
        at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:460)
        at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:453)
        at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
        at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
        at org.apache.kafka.clients.consumer.ConsumerConfig.<init>(ConsumerConfig.java:481)
        at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:635)
        at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:617)
        at com.sanwishe.MyConsumer.getConsumer(MyConsumer.java:51)
        at com.sanwishe.MyConsumer.run(MyConsumer.java:22)
        at com.sanwishe.ConsumerBootStrap.main(ConsumerBootStrap.java:52)

what???
I double-checked: IntegerDeserializer (like StringDeserializer) really does live in the kafka-clients jar. Since the class supposedly could not be found, I looked inside the shaded jar under target, and indeed it was not there???
At this point it seemed as if the build had simply not included the jar containing the deserializers, so I went back to the Maven packaging log:

[INFO] --- maven-shade-plugin:3.1.0:shade (default) @ kafka-consumer ---
[INFO] Including org.apache.kafka:kafka-clients:jar:1.0.0 in the shaded jar.
[INFO] Including org.lz4:lz4-java:jar:1.4 in the shaded jar.
[INFO] Including org.xerial.snappy:snappy-java:jar:1.1.4 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.25 in the shaded jar.
[INFO] Including commons-cli:commons-cli:jar:1.4 in the shaded jar.
[INFO] Minimizing jar com.sanwishe:kafka-consumer:jar:1.0-SNAPSHOT
[INFO] Minimized 1219 -> 847 (69%)
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /Users/jmz/IdeaProjects/kafkaconsumer/target/kafka-consumer-1.0-SNAPSHOT.jar with /Users/jmz/IdeaProjects/kafkaconsumer/target/kafka-consumer-1.0-SNAPSHOT-shaded.jar

That log line pretty much gives the cause away. Back in the pom there is <minimizeJar>true</minimizeJar>, and the maven-shade-plugin documentation (the original post showed a screenshot of it here) confirms what it does: it automatically removes every dependency class that is not referenced by the project's own code.
OK, remove that setting, rebuild, repackage, test: all good.
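
So why did minimizeJar hit this application in particular? The post does not show the MyConsumer source, but a consumer built the usual way looks roughly like the sketch below (only the class and method names come from the stack trace; the group id and the rest are assumptions). The important detail is that the deserializers exist only as strings inside a Properties object:

import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class MyConsumer {

    // Hypothetical reconstruction of getConsumer(); the real code is not in the post.
    static KafkaConsumer<Integer, String> getConsumer(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "kafka-consumer-cli"); // assumed group id
        // The deserializers are given as plain class-name strings and are instantiated
        // by reflection inside AbstractConfig at runtime. No .class reference ends up
        // in the bytecode, so minimizeJar counts them as unused and strips them.
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.IntegerDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        return new KafkaConsumer<>(props);
    }
}

Classes that are only reachable through reflection are invisible to the minimizer's bytecode analysis, which is exactly why IntegerDeserializer vanished from the shaded jar.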

The problem itself was simple, so while at it, here is a quick tour of the main maven-shade-plugin options.

  • createDependencyReducedPom
    Whether to generate a reduced pom for the shaded artifact. When true, the plugin removes the dependencies that were shaded into the jar from your pom and writes the result to a file named dependency-reduced-pom.xml, which is then used when the artifact is installed or deployed. The default is true; for a standalone executable jar it is usually set to false.
  • createSourcesJar
    Whether to also build a shaded sources jar; usually not needed.

  • filters
    A set of include/exclude rules that decide which entries from which artifacts end up in the shaded jar. Includes take effect before excludes, so you can use includes to pick up content and excludes to carve pieces back out. If no filters are configured, everything is included by default. An artifact can match several filters; the resulting content is their intersection. (Filters also interact with minimizeJar; a sketch of that follows the snippet below.)
    With multiple third-party dependencies, signed jars drag their signature files (*.SF, *.DSA, *.RSA) into the shaded jar's META-INF. The shaded jar's contents no longer match those signatures, so running it may throw java.lang.SecurityException: Invalid signature file digest for Manifest main attributes. The following filter excludes the signature files and avoids the problem:

<configuration>
    <filters>
          <filter>
              <artifact>*:*</artifact>
              <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
              </excludes>
          </filter>
    </filters>
</configuration>
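
On the minimizeJar interaction: the plugin's "Minimizing the Shaded Jar" documentation notes that classes covered by a filter's <includes> are exempt from minimization (since plugin version 1.6). So an alternative to dropping minimizeJar entirely would be to whitelist the artifact whose classes are only loaded reflectively; this is a sketch, not the configuration actually used in this post:

<configuration>
    <minimizeJar>true</minimizeJar>
    <filters>
        <!-- kafka-clients instantiates (de)serializers by class name at runtime,
             so exempt the whole artifact from minimization -->
        <filter>
            <artifact>org.apache.kafka:kafka-clients</artifact>
            <includes>
                <include>**</include>
            </includes>
        </filter>
    </filters>
</configuration>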
  • Specifying the main class
    The ManifestResourceTransformer writes the Main-Class entry into the shaded jar's MANIFEST.MF:
<executions>
    <execution>
        <phase>package</phase>
        <goals>
            <goal>shade</goal>
        </goals>
        <configuration>
            <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    <mainClass>com.sanwishe.ConsumerBootStrap</mainClass>
                </transformer>
            </transformers>
        </configuration>
    </execution>
</executions>
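To double-check that the transformer took effect, the manifest of the shaded jar can be printed straight from the archive (path as produced by the build above):

unzip -p target/kafka-consumer-1.0-SNAPSHOT.jar META-INF/MANIFEST.MF

The output should contain a Main-Class: com.sanwishe.ConsumerBootStrap line; without the ManifestResourceTransformer, java -jar fails with a "no main manifest attribute" error.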
  • finalName
    Sets the name of the shaded artifact. For example:
<executions>
    <execution>
        <phase>package</phase>
        <goals>
            <goal>shade</goal>
        </goals>
        <configuration>
            <finalName>myKafkaConsumer</finalName>
            <transformers>
                <!--transformer>...</transformer-->
            </transformers>
        </configuration>
    </execution>
</executions>

The target directory then ends up with the files below (the original post showed a screenshot of the listing here). Because finalName differs from the project's own final name, the shaded jar is written alongside the original artifact instead of replacing it; the executable jar is myKafkaConsumer.jar.
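
The renamed jar runs the same way as earlier (same flags as in the example at the top):

java -jar target/myKafkaConsumer.jar -z localhost:9092 -t default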

More to be continued.
