How to add Spark dependencies in a Spring Boot multi-module Java 11 project

Whenever I add a module-info.java to my multi-module project, I can no longer import my Spark dependencies - everything else seems to be working:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>


IntelliJ tries to re-add the Maven dependency, without any result.

My module-info looks like:

module common {

    exports [...]

    requires lombok;
    requires spring.data.jpa;
    requires spring.data.commons;
    requires org.apache.commons.lang3;
    requires spring.context;
    requires spring.web;
    requires spring.security.core;
    requires com.google.common;
    requires org.json;
    requires spring.core;
    requires spring.beans;
    requires com.fasterxml.jackson.core;
    requires com.fasterxml.jackson.databind;
    requires spring.jcl;
    requires spring.webmvc;
    requires mongo.java.driver;
    requires org.hibernate.orm.core;
    requires com.fasterxml.jackson.dataformat.csv;
    requires java.sql;
}

It is not possible to add org.apache.* to my module-info.java either.

Is it possible that Spark is not ready for Jigsaw modules and Java 9+?

Solution

"Is it possible that Spark is not ready for Jigsaw modules and Java 9+?"

That does hold true for Spark. Two immediate reasons that I can vouch for are:

They do not have an Automatic-Module-Name: entry in the artifact's MANIFEST.MF file.
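For reference, a library that wants to be safely usable as an automatic module ships a manifest entry like the following (the module name shown here is hypothetical - Spark does not actually declare one):

```
Manifest-Version: 1.0
Automatic-Module-Name: org.apache.spark.core
```

With such an entry, the JDK uses the declared name instead of deriving one from the JAR file name, so artifacts with otherwise problematic names can still be required from a module-info.java.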

If you try describing their artifacts using the jar tool:

jar --describe-module --file=/spark-core_2.12-3.0.0-preview2.jar

this would fail to derive a module descriptor, for a similar reason as mentioned in this answer: the Scala version suffix _2.12 in the artifact name survives into the derived module name, producing a segment that starts with a digit, which is not a legal Java identifier.
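Why the derivation fails can be sketched by re-implementing the naming rules the JDK applies to a plain JAR. This is an illustrative re-implementation of the rules documented for java.lang.module.ModuleFinder, not Spark's or the JDK's actual code:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AutomaticModuleName {

    // Derive an automatic module name from a JAR file name, following the
    // algorithm described in the ModuleFinder javadoc.
    public static String derive(String jarFileName) {
        // Drop the ".jar" extension.
        String name = jarFileName.substring(0, jarFileName.length() - ".jar".length());
        // Split off a trailing version: everything from the first hyphen
        // that is followed by a digit (e.g. "-3.0.0-preview2").
        Matcher m = Pattern.compile("-(\\d+(\\.|$))").matcher(name);
        if (m.find()) {
            name = name.substring(0, m.start());
        }
        // Replace every run of non-alphanumeric characters with a single dot
        // and trim leading/trailing dots.
        return name.replaceAll("[^A-Za-z0-9]+", ".")
                   .replaceAll("^\\.|\\.$", "");
    }

    public static void main(String[] args) {
        // The Scala suffix "_2.12" survives into the derived name, yielding
        // segments that start with a digit - not legal Java identifiers, so
        // the JDK rejects the module.
        System.out.println(derive("spark-core_2.12-3.0.0-preview2.jar"));
        // prints spark.core.2.12
    }
}
```

A well-named artifact such as guava-28.1-jre.jar derives cleanly to "guava"; the Spark artifacts do not, which is exactly why neither the jar tool nor a requires directive can resolve them.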

A few further resources on automatic modules might be useful once you reach this point.
