When submitting a Spark job, it fails with the exception SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former. The submit log looks like this:
Warning: Local jar /usr/local/spark-2.1.0-bin-hadoop2.6/conf/hdfs-site.xml does not exist, skipping.
Warning: Local jar /Users/haizhi/Documents/My_Git/bigdata_data_bin/conf/log4j.properties does not exist, skipping.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/haizhi/Documents/My_Git/bigdata_data_bin/jar/onlineprocess.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/spark-2.1.0-bin-hadoop2.6/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
ERROR - Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:560)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:558)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:558)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:546)
at scala.Option.foreach(Option.scala:257)
Solution: completely comment out the SPARK_CLASSPATH setting in spark-env.sh. This variable had been added back when HBase was installed. After commenting it out, the job submits successfully.
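As a minimal sketch of the fix (the HBase jar path below is a hypothetical example; use whatever path your SPARK_CLASSPATH actually pointed to), disable the legacy variable in conf/spark-env.sh and, if those jars are still needed, declare them through the supported properties in conf/spark-defaults.conf instead:

```shell
# conf/spark-env.sh
# Comment out the deprecated variable (added during the HBase setup):
# export SPARK_CLASSPATH=/usr/local/hbase/lib/*

# conf/spark-defaults.conf
# If the extra jars are still required, use the supported properties
# that replaced SPARK_CLASSPATH (path shown is an assumption):
# spark.driver.extraClassPath   /usr/local/hbase/lib/*
# spark.executor.extraClassPath /usr/local/hbase/lib/*
```

Alternatively, the same classpath can be passed per submission with `spark-submit --driver-class-path`, which avoids touching the cluster-wide config files.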