Spark versions: 3.0.1 and 2.4.3
I. scala-compile-first
Scenario 1:
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-unsafe_2.11 ---
[INFO] D:\workspace\idea\spark\common\unsafe\src\test\java:-1: info: compiling
[INFO] D:\workspace\idea\spark\common\unsafe\src\test\scala:-1: info: compiling
[INFO] Compiling 6 source files to D:\workspace\idea\spark\common\unsafe\target\scala-2.11\test-classes at 1618055329437
[ERROR] D:\workspace\idea\spark\common\unsafe\src\test\scala\org\apache\spark\unsafe\types\UTF8StringPropertyCheckSuite.scala:26: error: object UTF8String is not a member of package org.apache.spark.unsafe.types
The Scala test classes import Java classes, but during testCompile the scala-maven-plugin compiled the Scala sources first, before the Java test sources had been compiled, hence the missing-symbol error. (Within one project you sometimes need Scala compiled first and sometimes Java first, so this plugin has to be configured flexibly per scenario.)
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <!-- 3.3.1 won't work with zinc; fails to find javac from java.home -->
    <version>3.2.2</version>
    <executions>
        <execution>
            <id>eclipse-add-source</id>
            <goals>
                <goal>add-source</goal>
            </goals>
        </execution>
        <execution>
            <id>scala-compile-first</id>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
        <!-- <execution>
            <id>scala-test-compile-first</id>
            <goals>
                <goal>testCompile</goal>
            </goals>
        </execution> -->
    </executions>
    ....
After commenting out the scala-test-compile-first execution as above, the build succeeds:
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-unsafe_2.11 ---
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-unsafe_2.11 ---
Scenario 2:
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-core_2.11 ---
[INFO] D:\workspace\idea\spark\core\src\main\java:-1: info: compiling
[INFO] D:\workspace\idea\spark\core\src\main\scala:-1: info: compiling
[INFO] Compiling 575 source files to D:\workspace\idea\spark\core\target\scala-2.11\classes at 1618056828208
[ERROR] D:\workspace\idea\spark\core\src\main\scala\org\apache\spark\Aggregator.scala:20: error: object DeveloperApi is not a member of package org.apache.spark.annotation
[ERROR] import org.apache.spark.annotation.DeveloperApi
Jumping into the error shows that during the compile phase a Scala class imports a Java class, but the Scala sources were compiled first, so the Java class cannot be found. Commenting out the following part of the plugin fixes it:
<execution>
    <id>scala-compile-first</id>
    <goals>
        <goal>compile</goal>
    </goals>
</execution>
Scenario 3: scala-compile-first does not take effect, symbol-not-found errors
Bind the plugin's scala-compile-first execution to the process-resources phase (note that it now also runs the add-source goal):
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <!-- 3.3.1 won't work with zinc; fails to find javac from java.home -->
    <version>3.2.2</version>
    <executions>
        <execution>
            <id>eclipse-add-source</id>
            <goals>
                <goal>add-source</goal>
            </goals>
        </execution>
        <execution>
            <id>scala-compile-first</id>
            <phase>process-resources</phase>
            <goals>
                <goal>add-source</goal>
                <goal>compile</goal>
            </goals>
        </execution>
        <execution>
            <id>scala-test-compile-first</id>
            <goals>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
        <recompileMode>incremental</recompileMode>
        <useZincServer>true</useZincServer>
        <args>
            <arg>-unchecked</arg>
            <arg>-deprecation</arg>
            <arg>-feature</arg>
            <arg>-explaintypes</arg>
            <arg>-Yno-adapted-args</arg>
            <arg>-nobootcp</arg>
        </args>
        <jvmArgs>
            <jvmArg>-Xms1024m</jvmArg>
            <jvmArg>-Xmx1024m</jvmArg>
            <jvmArg>-XX:ReservedCodeCacheSize=${CodeCacheSize}</jvmArg>
        </jvmArgs>
        <javacArgs>
            <javacArg>-source</javacArg>
            <javacArg>${java.version}</javacArg>
            <javacArg>-target</javacArg>
            <javacArg>${java.version}</javacArg>
            <javacArg>-Xlint:all,-serial,-path,-try</javacArg>
        </javacArgs>
    </configuration>
</plugin>
After adjusting the phase, the build now compiles the Scala sources (together with the Java sources):
[INFO] Compiling 495 Scala sources and 81 Java sources to D:\workspace\idea\spark\core\target\scala-2.11\classes...
II. Other issues
1. Could not transfer artifact xxx from/to xxx
Could not transfer artifact org.apache:apache:pom:18 from/to maven-net-cn (http://maven.net.cn/content/groups/public/): Transfer failed for http://maven.net.cn/content/groups/public/org/apache/apache/18/apache-18.pom
The dependency failed to download. Locate the corresponding artifact directory in the local Maven repository, delete the files ending in .lastUpdated, and let Maven download again. If that still fails, switch the repository mirror in settings.xml, then re-import and download again.
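Deleting the stale markers can be scripted; a minimal sketch, assuming the default local repository location ~/.m2/repository (on Windows, the same files can be found by searching for *.lastUpdated under the repository folder):

```shell
# Remove the .lastUpdated marker files that make Maven treat a failed
# download as cached; Maven will retry those downloads on the next build.
find ~/.m2/repository -name "*.lastUpdated" -delete
```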
<mirror>
    <id>alimaven</id>
    <mirrorOf>central</mirrorOf>
    <name>aliyun maven</name>
    <url>https://maven.aliyun.com/nexus/content/repositories/central/</url>
</mirror>
<mirror>
    <id>alimaven-public</id>
    <name>aliyun maven</name>
    <url>https://maven.aliyun.com/nexus/content/groups/public/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
2. An Ant BuildException has occurred: Execute failed: java.io.IOException: Cannot run program "bash": CreateProcess error=2, the system cannot find the specified file
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.8:run (default) on project spark-core_2.12: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "bash": CreateProcess error=2, the system cannot find the specified file.
[ERROR] around Ant part ...<exec executable="bash">... @ 4:27 in D:\Users\jeffrey.miao\IdeaProjects\spark-src\core\target\antrun\build-main.xml
If the generated Spark build information is not needed, comment out the maven-antrun-plugin (or set skip to true, i.e. add a skip element set to true inside the plugin's configuration):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <executions>
        <execution>
            <phase>generate-resources</phase>
            <configuration>
                <!-- Execute the shell script to generate the spark build information. -->
                <target>
                    <exec executable="bash">
                        <arg value="${project.basedir}/../build/spark-build-info"/>
                        <arg value="${project.build.directory}/extra-resources"/>
                        <arg value="${project.version}"/>
                    </exec>
                </target>
            </configuration>
            <goals>
                <goal>run</goal>
            </goals>
        </execution>
    </executions>
</plugin>
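The skip alternative mentioned above would look roughly like this, assuming the plugin's documented skip parameter (property maven.antrun.skip):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <configuration>
        <!-- skip all antrun executions, so the bash script is never invoked -->
        <skip>true</skip>
    </configuration>
    <!-- the existing <executions> block can stay as it is -->
</plugin>
```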
3. scalastyle-maven-plugin
[ERROR] Failed to execute goal org.scalastyle:scalastyle-maven-plugin:1.0.0:check (default) on project spark-core_2.12: Failed during scalastyle execution: Unable to find configuration file at location scalastyle-config.xml -> [Help 1]
This is the Scala code-style plugin; it can simply be commented out in the root pom (or have skip set to true):
<!--<plugin>
    <groupId>org.scalastyle</groupId>
    <artifactId>scalastyle-maven-plugin</artifactId>
    <version>1.0.0</version>
    <configuration>
        <verbose>false</verbose>
        <failOnViolation>true</failOnViolation>
        <includeTestSourceDirectory>false</includeTestSourceDirectory>
        <failOnWarning>false</failOnWarning>
        <sourceDirectory>${basedir}/src/main/scala</sourceDirectory>
        <testSourceDirectory>${basedir}/src/test/scala</testSourceDirectory>
        <configLocation>scalastyle-config.xml</configLocation>
        <outputFile>${basedir}/target/scalastyle-output.xml</outputFile>
        <inputEncoding>${project.build.sourceEncoding}</inputEncoding>
        <outputEncoding>${project.reporting.outputEncoding}</outputEncoding>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>check</goal>
            </goals>
        </execution>
    </executions>
</plugin>-->
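If commenting out the whole block is too invasive, the same effect can likely be had with the plugin's skip flag (property scalastyle.skip); a minimal sketch:

```xml
<plugin>
    <groupId>org.scalastyle</groupId>
    <artifactId>scalastyle-maven-plugin</artifactId>
    <version>1.0.0</version>
    <configuration>
        <!-- disables the scalastyle check goal entirely -->
        <skip>true</skip>
    </configuration>
</plugin>
```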
4. maven-checkstyle-plugin
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:3.1.0:check (default) on project spark-core_2.12: Failed during checkstyle configuration: cannot initialize module SuppressionFilter - Unable to find: dev/checkstyle-suppressions.xml -> [Help 1]
The Java code-style check plugin; comment it out as well (or set skip to true):
<!--<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-checkstyle-plugin</artifactId>
    <version>3.1.0</version>
    <configuration>
        <failOnViolation>false</failOnViolation>
        <includeTestSourceDirectory>true</includeTestSourceDirectory>
        <sourceDirectories>
            <directory>${basedir}/src/main/java</directory>
            <directory>${basedir}/src/main/scala</directory>
            <directory>${basedir}/v${hive.version.short}/src/main/java</directory>
            <directory>${basedir}/v${hive.version.short}/src/main/scala</directory>
        </sourceDirectories>
        <testSourceDirectories>
            <directory>${basedir}/src/test/java</directory>
            <directory>${basedir}/v${hive.version.short}/src/test/java</directory>
            <directory>${basedir}/v${hive.version.short}/src/test/scala</directory>
        </testSourceDirectories>
        <configLocation>dev/checkstyle.xml</configLocation>
        <outputFile>${basedir}/target/checkstyle-output.xml</outputFile>
        <inputEncoding>${project.build.sourceEncoding}</inputEncoding>
        <outputEncoding>${project.reporting.outputEncoding}</outputEncoding>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>com.puppycrawl.tools</groupId>
            <artifactId>checkstyle</artifactId>
            <version>8.29</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <goals>
                <goal>check</goal>
            </goals>
        </execution>
    </executions>
</plugin>-->
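The skip variant, assuming the plugin's standard skip parameter (property checkstyle.skip):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-checkstyle-plugin</artifactId>
    <version>3.1.0</version>
    <configuration>
        <!-- skips the checkstyle check goal entirely -->
        <skip>true</skip>
    </configuration>
</plugin>
```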
This article covered the Maven problems encountered when building Spark 3.0.1 and 2.4.3: how the scala-maven-plugin's scala-compile-first execution behaves in different scenarios, plus dependency-download failures, the Ant BuildException, and issues with scalastyle-maven-plugin and maven-checkstyle-plugin, along with their fixes.