When running a Spark program for testing in IntelliJ IDEA, the following error is reported:
Using Spark’s default log4j profile: org/apache/spark/log4j-defaults.properties
20/06/14 21:45:56 INFO SparkContext: Running Spark version 2.3.4
20/06/14 21:45:57 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
	at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:367)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)
at com.taxi.TaxiApp$.main(TaxiApp.scala:15)
at com.taxi.TaxiApp.main(TaxiApp.scala)
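The cause: when the program is launched directly from the IDE, nothing supplies a master URL, so SparkContext initialization fails inside `SparkSession.builder().getOrCreate()`. A minimal sketch that reproduces the error (the class name `TaxiApp` comes from the trace above; the method body is a hypothetical reconstruction):

```scala
package com.taxi

import org.apache.spark.sql.SparkSession

object TaxiApp {
  def main(args: Array[String]): Unit = {
    // No .master(...) here, and no -Dspark.master VM option either:
    // Spark cannot determine where to run, so SparkContext throws
    // "A master URL must be set in your configuration".
    val spark = SparkSession.builder()
      .appName("TaxiApp")
      .getOrCreate()

    spark.stop()
  }
}
```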
Solution:
Click Run,
select Edit Configurations,
and in VM options enter -Dspark.master=local.
-Dspark.master=local means the Spark program will run in local mode.
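Alternatively, the master can be set directly in code, which avoids configuring every run configuration by hand (a sketch using the SparkSession builder API; `local[*]` is one common choice that uses all local cores):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("TaxiApp")
  .master("local[*]")  // run locally, using all available cores
  .getOrCreate()
```

Note that a master hard-coded this way takes precedence over the `--master` flag of spark-submit, so it is usually removed (or made conditional) before deploying to a cluster.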