Maven Basics Tutorial

A guide to common Maven problems and configuration.

@(OTHERS)[maven]

1. Maven download errors

Sometimes Maven fails during the dependency-download stage of compiling or packaging. This is usually a network problem; trust yourself, you did not get the configuration wrong. For example:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (default-test) on project myusml: Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test failed: Plugin org.apache.maven.plugins:maven-surefire-plugin:2.18.1 or one of its dependencies could not be resolved: Failed to collect dependencies at org.apache.maven.plugins:maven-surefire-plugin:jar:2.18.1 -> org.apache.maven.surefire:maven-surefire-common:jar:2.18.1: Failed to read artifact descriptor for org.apache.maven.surefire:maven-surefire-common:jar:2.18.1: Could not transfer artifact org.apache.maven.surefire:maven-surefire-common:pom:2.18.1 from/to central (https://repo.maven.apache.org/maven2): Remote host closed connection during handshake: SSL peer shut down incorrectly -> [Help 1]

Access to some foreign sites from within China is unstable, so add a domestic mirror of the central repository:

1. Edit $M2_HOME/conf/settings.xml and add a domestic source.
(1) Configure the mirror:

<mirrors>
...
        <mirror>
            <id>osc</id>
            <mirrorOf>central</mirrorOf>
            <url>http://maven.oschina.net/content/groups/public/</url>
        </mirror>
        <mirror>
            <id>osc_thirdparty</id>
            <mirrorOf>thirdparty</mirrorOf>
            <url>http://maven.oschina.net/content/repositories/thirdparty/</url>
        </mirror>
...
</mirrors>

(2) Configure the profile:

<profiles>
...
        <profile>
            <id>osc</id>
            <activation>
                <activeByDefault>true</activeByDefault>
            </activation>
            <repositories>
                <repository>
                    <id>osc</id>
                    <url>http://maven.oschina.net/content/groups/public/</url>
                </repository>
                <repository>
                    <id>osc_thirdparty</id>
                    <url>http://maven.oschina.net/content/repositories/thirdparty/</url>
                </repository>
            </repositories>
            <pluginRepositories>
                <pluginRepository>
                    <id>osc</id>
                    <url>http://maven.oschina.net/content/groups/public/</url>
                </pluginRepository>
            </pluginRepositories>
        </profile>
...
</profiles>

See http://maven.oschina.net/help.html for details.

2. The following error appears:
Description Resource Path Location Type Could not calculate build plan: Failure to transfer org.apache.maven.plugins:maven-compiler-plugin:pom:2.0.2

Run:

find ~/.m2 -name "*.lastUpdated" -exec grep -q "Could not transfer" {} \; -print -exec rm {} \;

The cause is that some earlier downloads failed and left *.lastUpdated marker files behind.
See http://stackoverflow.com/questions/5074063/maven-error-failure-to-transfer for details.

3. Configure Eclipse to use the locally installed Maven rather than the embedded one. In the User Settings, also point to the settings.xml that contains the domestic repositories configured above.
Otherwise, connections to Maven Central can fail from within China, producing the following exception:

Description Resource    Path    Location    Type
Failed to read artifact descriptor for org.apache.storm:storm-core:jar:0.10.0

org.eclipse.aether.resolution.ArtifactDescriptorException: Failed to read artifact descriptor for org.apache.storm:storm-core:jar:0.10.0
    at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:302)
    at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.readArtifactDescriptor(DefaultArtifactDescriptorReader.java:218)
    at org.eclipse.aether.internal.impl.DefaultDependencyCollector.resolveCachedArtifactDescriptor(DefaultDependencyCollector.java:535)
    at org.eclipse.aether.internal.impl.DefaultDependencyCollector.getArtifactDescriptorResult(DefaultDependencyCollector.java:519)
    at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:409)
    at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:363)
    at org.eclipse.aether.internal.impl.DefaultDependencyCollector.process(DefaultDependencyCollector.java:351)
    at org.eclipse.aether.internal.impl.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:254)
    at org.eclipse.aether.internal.impl.DefaultRepositorySystem.collectDependencies(DefaultRepositorySystem.java:316)
    at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:172)
    at org.apache.maven.project.DefaultProjectBuilder.resolveDependencies(DefaultProjectBuilder.java:215)
    at org.apache.maven.project.DefaultProjectBuilder.build(DefaultProjectBuilder.java:188)
    at org.apache.maven.project.DefaultProjectBuilder.build(DefaultProjectBuilder.java:119)
    at org.eclipse.m2e.core.internal.embedder.MavenImpl.readMavenProject(MavenImpl.java:636)
    at org.eclipse.m2e.core.internal.project.registry.DefaultMavenDependencyResolver.resolveProjectDependencies(DefaultMavenDependencyResolver.java:63)
    at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refreshPhase2(ProjectRegistryManager.java:529)
    at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager$3.call(ProjectRegistryManager.java:491)
    at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager$3.call(ProjectRegistryManager.java:1)
    at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.executeBare(MavenExecutionContext.java:176)
    at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:151)
    at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refresh(ProjectRegistryManager.java:495)
    at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refresh(ProjectRegistryManager.java:350)
    at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refresh(ProjectRegistryManager.java:297)
    at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager.updateProjectConfiguration0(ProjectConfigurationManager.java:398)
    at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager$2.call(ProjectConfigurationManager.java:345)
    at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager$2.call(ProjectConfigurationManager.java:1)
    at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.executeBare(MavenExecutionContext.java:176)
    at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:151)
    at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:99)
    at org.eclipse.m2e.core.internal.embedder.MavenImpl.execute(MavenImpl.java:1351)
    at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager.updateProjectConfiguration(ProjectConfigurationManager.java:342)
    at org.eclipse.m2e.core.ui.internal.UpdateMavenProjectJob.runInWorkspace(UpdateMavenProjectJob.java:77)
    at org.eclipse.core.internal.resources.InternalWorkspaceJob.run(InternalWorkspaceJob.java:38)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Failure to transfer org.apache.storm:storm-core:pom:0.10.0 from https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced. Original error: Could not transfer artifact org.apache.storm:storm-core:pom:0.10.0 from/to central (https://repo.maven.apache.org/maven2): Remote host closed connection during handshake
    at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:444)
    at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:246)
    at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:223)
    at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:287)
    ... 33 more
Caused by: org.eclipse.aether.transfer.ArtifactTransferException: Failure to transfer org.apache.storm:storm-core:pom:0.10.0 from https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced. Original error: Could not transfer artifact org.apache.storm:storm-core:pom:0.10.0 from/to central (https://repo.maven.apache.org/maven2): Remote host closed connection during handshake
    at org.eclipse.aether.internal.impl.DefaultUpdateCheckManager.newException(DefaultUpdateCheckManager.java:238)
    at org.eclipse.aether.internal.impl.DefaultUpdateCheckManager.checkArtifact(DefaultUpdateCheckManager.java:206)
    at org.eclipse.aether.internal.impl.DefaultArtifactResolver.gatherDownloads(DefaultArtifactResolver.java:585)
    at org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:503)
    at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:421)
    ... 36 more
    pom.xml /stormfilter    line 1  Maven Dependency Problem

If that still does not work, run mvn clean package on the command line first and look at the reported cause; it is very likely a failure to connect to the central repository.

2. Building an uber jar: shade and provided

By default, mvn package only packages your own code; it assumes the dependencies it needs are already on the classpath of the machine it runs on. Sometimes, however, all of the project's dependencies should be bundled into the published jar. This can be done with the following plugin:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
            </configuration>
        </plugin>
    </plugins>
</build>

If some dependencies should not be bundled into the jar, declare them with the provided scope. For example, when running a Spark program the Spark jars are always on the classpath, so the Spark dependency can be marked as provided:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.1</version>
        <scope>provided</scope>
    </dependency>

Using this plugin can lead to the following error:

2016-08-11 10:08:54,902 ERROR org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost: Failed to load coprocessor com.lujinhong.hbase.solr.indexer.SolrIndexerObserver
java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
    at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:286)
    at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:239)
    at java.util.jar.JarVerifier.processEntry(JarVerifier.java:317)
    at java.util.jar.JarVerifier.update(JarVerifier.java:228)
    at java.util.jar.JarFile.initializeVerifier(JarFile.java:348)
    at java.util.jar.JarFile.getInputStream(JarFile.java:415)
    at sun.misc.URLClassPath$JarLoader$2.getInputStream(URLClassPath.java:775)
    at sun.misc.Resource.cachedInputStream(Resource.java:77)
    at sun.misc.Resource.getByteBuffer(Resource.java:160)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:436)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at org.apache.hadoop.hbase.util.CoprocessorClassLoader.loadClass(CoprocessorClassLoader.java:292)
    at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.load(CoprocessorHost.java:183)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.loadTableCoprocessors(RegionCoprocessorHost.java:326)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.<init>(RegionCoprocessorHost.java:225)
    at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:773)
    at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:681)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:5677)
    at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5987)
    at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5959)
    at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5915)
    at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5866)
    at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:356)
    at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:126)
    at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:128)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

The solution is described at:
http://stackoverflow.com/questions/999489/invalid-signature-file-when-attempting-to-run-a-jar

That is, add the following to the plugin's configuration (it is already included in the example above):

<configuration>
    <filters>
        <filter>
            <artifact>*:*</artifact>
            <excludes>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.RSA</exclude>
            </excludes>
        </filter>
    </filters>
    <!-- Additional configuration. -->
</configuration>

3. Dependency conflicts: NoSuchMethodError, ClassNotFoundException

If a NoSuchMethodError or ClassNotFoundException occurs at runtime even though you are sure the class is on the classpath, the classpath contains two different versions of the same jar. A common example is log4j: you add log4j.jar to the classpath, Spark's lib directory also ships a log4j.jar, and if the two versions differ this kind of problem appears.

There are two ways to solve this:
(1) Change your code to use the version of the jar that Spark (for example) already provides.
(2) Use a technique called shading to modify your application. The maven-shade-plugin mentioned above supports this through additional configuration: shading keeps the conflicting packages under a different namespace and automatically rewrites your application's classes to use the renamed version. It is a somewhat blunt technique, but it is very effective for resolving runtime dependency conflicts; a sketch of the configuration follows this list.
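
For example, here is a minimal sketch of a relocation, to be added inside the shade plugin's <configuration> shown earlier. The package names are hypothetical placeholders for whatever packages actually conflict in your build:

<configuration>
    <relocations>
        <!-- Keep the conflicting packages under a private namespace; references
             in your own classes are rewritten automatically by the shade plugin. -->
        <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
        </relocation>
    </relocations>
</configuration>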

Another example: Storm uses log4j 2.x while HBase uses 1.x, so when importing the HBase dependency into a Storm project the old logging artifacts have to be excluded:

    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>1.0.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

Any log4j conflict that shows up as NoClassDefFoundError or similar can be resolved in the same way.

4. Adding CDH dependencies

Add the following to pom.xml:

<repositories>
    <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
</repositories>

Then specify a CDH version for the dependency:

    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>1.0.0-cdh5.4.5</version>
    </dependency>

The available version numbers can be found at https://repository.cloudera.com/artifactory/cloudera-repos/.

5. Skipping tests

-DskipTests skips running the test cases but still compiles them, placing the class files under target/test-classes.

-Dmaven.test.skip=true skips both running and compiling the test cases.
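
The same effect can also be configured in pom.xml instead of on the command line; a minimal sketch, assuming tests should be skipped for every build of this project:

<properties>
    <!-- Equivalent to -DskipTests: test classes are compiled but not run. -->
    <skipTests>true</skipTests>
    <!-- Or, equivalent to -Dmaven.test.skip=true: tests are neither compiled nor run. -->
    <!-- <maven.test.skip>true</maven.test.skip> -->
</properties>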

6. Choosing the JDK version

Add the following inside the <profiles> element:

<profile>
    <activation>
        <!-- Make the profile apply without having to pass -P on the command line. -->
        <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <maven.compiler.compilerVersion>1.8</maven.compiler.compilerVersion>
    </properties>
</profile>

If these parameters are not specified, the JDK defined by $JAVA_HOME is used, so the behaviour can also be changed by changing $JAVA_HOME.

Alternatively, add the following to pom.xml:

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
  </plugins>
</build>

7. SecurityException: Invalid signature file digest for Manifest main attributes

A better solution is described above in section 2 (shade and uber jars).
When packaging with Maven (especially when building an uber jar), the following error may occur:

Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
    at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:286)
    at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:239)
    at java.util.jar.JarVerifier.processEntry(JarVerifier.java:317)
    at java.util.jar.JarVerifier.update(JarVerifier.java:228)
    at java.util.jar.JarFile.initializeVerifier(JarFile.java:348)
    at java.util.jar.JarFile.getInputStream(JarFile.java:415)
    at sun.misc.URLClassPath$JarLoader$2.getInputStream(URLClassPath.java:775)
    at sun.misc.Resource.cachedInputStream(Resource.java:77)
    at sun.misc.Resource.getByteBuffer(Resource.java:160)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:436)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)

This happens because signature metadata files from signed dependencies were packaged into the jar. They can be removed directly:

zip -d jarname.jar META-INF/*.RSA META-INF/*.DSA META-INF/*.SF

8. Scala plugin

If the Maven project contains Scala code, add the following to pom.xml:

<build>
    <!--<sourceDirectory>src/main/scala</sourceDirectory>-->
    <plugins>
         <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.0</version>
            <executions>
                <execution>
                    <id>compile-scala</id>
                    <phase>compile</phase>
                    <goals>
                        <goal>add-source</goal>
                        <goal>compile</goal>
                    </goals>
                </execution>
                <execution>
                    <id>test-compile-scala</id>
                    <phase>test-compile</phase>
                    <goals>
                        <goal>add-source</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <scalaVersion>2.11.8</scalaVersion>
            </configuration>
        </plugin>
    </plugins>
</build>

9. Custom properties

Defining them:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.version>0.2</project.version>
</properties>

Using them:

<groupId>com.lujinhong</groupId>
<artifactId>lujinhong-commons</artifactId>
<version>${project.version}</version>
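
Note that project.* is a reserved prefix (${project.version} normally resolves from the POM's own <version> element), so a custom property is usually given its own name. A minimal sketch, where hadoop.version and its value are hypothetical:

<properties>
    <hadoop.version>2.6.0</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>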

10. Depending on a local project

First run mvn install on the local project to install it into the local repository, then add the repository to the consuming project:

<repositories>
    <repository>
        <id>in-project</id>
        <name>In Project Repo</name>
        <url>file:///Users/liaoliuqing/.m2/repository</url>
    </repository>
</repositories>

After that the dependency can be used as usual, for example:

<dependency>
        <groupId>com.lujinhong</groupId>
        <artifactId>lujinhong-commons-hadoop</artifactId>
        <version>${project.version}</version>
</dependency>

11. A note on version: do not use a property as the version

This applies especially to child modules of a parent project; otherwise all sorts of problems appear: http://stackoverflow.com/questions/1981151/warning-on-using-project-parent-version-as-the-version-of-a-module-in-maven-3

[WARNING] 'version' contains an expression but should be a constant. @ com.lujinhong:lujinhong-commons:${project.version}, /Users/liaoliuqing/99_Project/lujinhong-commons/pom.xml, line 9, column 11
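
The usual fix is to give the parent POM a literal version and let each child module inherit it by omitting its own <version> element; a minimal sketch of a child module's pom.xml, reusing the coordinates from the examples above:

<parent>
    <groupId>com.lujinhong</groupId>
    <artifactId>lujinhong-commons</artifactId>
    <version>0.2</version>
</parent>
<!-- groupId and version are inherited from the parent; only artifactId is declared. -->
<artifactId>lujinhong-commons-hadoop</artifactId>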
