I am trying to execute a simple Scala script with Spark, as described in the Spark Quick Start tutorial. I have no trouble executing the following Python code:
"""SimpleApp.py"""
from pyspark import SparkContext
logFile = "tmp.txt" # Should be some file on your system
sc = SparkContext("local", "Simple App")
logData = sc.textFile(logFile).cache()
numAs = logData.filter(lambda s: 'a' in s).count()
numBs = logData.filter(lambda s: 'b' in s).count()
print "Lines with a: %i, lines with b: %i" % (numAs, numBs)
I execute this code using the following command:
/home/aaa/spark/spark-2.1.0-bin-hadoop2.7/bin/spark-submit hello_world.py
However, when I try to do the same using Scala, I run into problems. In more detail, the code that I am trying to execute is:
/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "tmp.txt" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
I try to execute it in the following way:
/home/aaa/spark/spark-2.1.0-bin-hadoop2.7/bin/spark-submit hello_world.scala
As a result, I get the following error message:
Error: Cannot load main class from JAR file
Does anybody know what I am doing wrong?
Solution
I want to add to @JacekLaskowski's answer an alternative solution that I sometimes use for POC or testing purposes.
The idea is to run script.scala from inside the spark-shell with :load:
:load /path/to/script.scala
You won't need to define a SparkContext/SparkSession, as the script will use the variables already defined in the scope of the REPL.
You also don't need to wrap the code in a Scala object.
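For reference, here is roughly what such a script could look like (a minimal sketch adapted from the question's code, assuming the sc that spark-shell already provides and a tmp.txt in the working directory; the name script.scala is just an illustration):

/* script.scala -- loaded inside spark-shell via :load */
val logFile = "tmp.txt" // should be some file on your system
// `sc` is the SparkContext that the spark-shell predefines
val logData = sc.textFile(logFile, 2).cache()
val numAs = logData.filter(line => line.contains("a")).count()
val numBs = logData.filter(line => line.contains("b")).count()
println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))

You would then start the shell (e.g. /home/aaa/spark/spark-2.1.0-bin-hadoop2.7/bin/spark-shell, matching the installation path used above) and type :load /path/to/script.scala at the prompt.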
PS: I consider this more of a hack, not something to use for production purposes.