This error shows up when a SparkSession tries to read a file, and at first glance it is baffling. The likely cause is resource clobbering at packaging time: each Spark data source jar ships a META-INF/services/org.apache.spark.sql.sources.DataSourceRegister service file, and when a fat jar is built with default settings only one copy survives, so Spark SQL can no longer locate the data source implementation it needs to read the file.
The fix is to declare, when packaging, that these service files must be concatenated rather than overwritten.
Maven packaging with maven-shade-plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Concatenate, rather than overwrite, the DataSourceRegister service files -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/services/org.apache.spark.sql.sources.DataSourceRegister</resource>
          </transformer>
        </transformers>
        <finalName>${project.artifactId}-${project.version}-uber</finalName>
      </configuration>
    </execution>
  </executions>
</plugin>
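The AppendingTransformer tells the shade plugin to concatenate every copy of that service file instead of keeping only the first one it encounters. After a successful build, the merged file inside the uber jar lists one implementation class per line, roughly like this (the exact entries depend on which data source jars are on your classpath; the lines below are illustrative for Spark 2.x plus the spark-avro package):

```
org.apache.spark.sql.execution.datasources.csv.CSVFileFormat
org.apache.spark.sql.execution.datasources.json.JsonFileFormat
org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat
com.databricks.spark.avro.DefaultSource
```

You can inspect the result with `unzip -p <your-uber-jar> META-INF/services/org.apache.spark.sql.sources.DataSourceRegister` and confirm that all expected data sources appear.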
When packaging with sbt assembly, the equivalent is a merge strategy that concatenates this service file and falls back to the default strategy for everything else (the fallback case is required; without it the build fails on any other duplicate path):

assemblyMergeStrategy in assembly := {
  case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}