Fixing the Spark 2.0 error "Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:..."

While working with Spark 2.0 I ran into the following error:

16/09/21 14:12:22 INFO SharedState: Warehouse path is 'file:E:\scalacode_v2\Spark2Pro/spark-warehouse'.
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:E:/scalacode_v2/Spark2Pro/spark-warehouse
    at org.apache.hadoop.fs.Path.initialize(Path.java:206)
    at org.apache.hadoop.fs.Path.<init>(Path.java:172)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:114)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
    at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
    at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
    at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:112)
    at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
    at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:161)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)
    at org.apache.spark.sql.Dataset$.apply(Dataset.scala:59)
    at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:441)
    at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:395)
    at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:163)
    at com.yisa.test.Test$.main(Test.scala:24)
    at com.yisa.test.Test.main(Test.scala)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:E:/scalacode_v2/Spark2Pro/spark-warehouse
    at java.net.URI.checkPath(Unknown Source)
    at java.net.URI.<init>(Unknown Source)
    at org.apache.hadoop.fs.Path.initialize(Path.java:203)
    ... 18 more


A Google search turned up the following explanation:

The default value of `spark.sql.warehouse.dir` is `System.getProperty("user.dir")/spark-warehouse`. Since `System.getProperty("user.dir")` is a local dir, we should explicitly set the scheme to local filesystem.

In other words, when `spark.sql.warehouse.dir` is not set, Spark falls back to a `spark-warehouse` directory under `user.dir`. On Windows that produces a URI like `file:E:/scalacode_v2/Spark2Pro/spark-warehouse`: the path after the `file:` scheme does not start with a slash, and `java.net.URI` rejects a relative path inside an absolute (scheme-bearing) URI, which is exactly the exception in the stack trace above.

So we need to set the property explicitly, using a full `file:///` URI:
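The check that fails lives in the JDK itself, not in Spark: Hadoop's Path.initialize (frame 1 of the stack trace) hands the raw Windows path to a multi-argument java.net.URI constructor. A minimal reproduction, written in plain Java since only the JDK is involved (the class name UriDemo is just for illustration):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriDemo {
    public static void main(String[] args) throws URISyntaxException {
        // Hadoop's Path.initialize builds the URI roughly like this.
        // A Windows path such as "E:/..." does not begin with "/", so the
        // URI spec treats it as a relative path combined with an absolute
        // (scheme-bearing) URI -- the exact error from the stack trace.
        try {
            new URI("file", null, "E:/scalacode_v2/Spark2Pro/spark-warehouse",
                    null, null);
        } catch (URISyntaxException e) {
            System.out.println(e.getReason());
        }

        // With a leading slash (i.e. the file:///e:/... form used in the
        // fix below) the path is absolute and the same constructor succeeds.
        URI ok = new URI("file", null, "/e:/tmp/spark-warehouse", null, null);
        System.out.println(ok);
    }
}
```

This is why the fix is simply to spell out the scheme and an absolute path.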

    val sparkSession = SparkSession.builder()
      .master("local[2]")
      .appName("example")
      .config("spark.sql.warehouse.dir", "file:///e:/tmp/spark-warehouse")
      .getOrCreate()


After adding the configuration, the job runs successfully.
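The same setting can also be passed at submit time instead of hard-coding it, since --conf accepts any Spark property (a sketch: the class name is the one from this example, while the jar name and the warehouse path are hypothetical and should be adjusted for your machine):

```shell
spark-submit \
  --class com.yisa.test.Test \
  --master "local[2]" \
  --conf spark.sql.warehouse.dir=file:///e:/tmp/spark-warehouse \
  spark2pro.jar
```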
