1. Starting Spark with ./bin/spark-shell throws: java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
Solution: add export SPARK_LOCAL_IP="127.0.0.1" to conf/spark-env.sh.
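For reference, a minimal sketch of the entry (if conf/spark-env.sh does not exist yet, it can be created from the conf/spark-env.sh.template that ships with Spark):

    # conf/spark-env.sh -- bind the driver to loopback so the
    # 'sparkDriver' service does not try an unresolvable address
    export SPARK_LOCAL_IP="127.0.0.1"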
2. Java Kafka producer error: ERROR kafka.utils.Utils$ - fetching topic metadata for topics [Set(words_topic)] from broker [ArrayBuffer(id:0, host:xxxxxx, port:9092)] failed
Solution: set 'advertised.host.name' in the Kafka broker's server.properties to the server's real IP (the same address the producer uses in its 'metadata.broker.list' property).
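A sketch of the broker side, with 192.168.1.100 as a placeholder for the real IP:

    # config/server.properties on the Kafka broker
    advertised.host.name=192.168.1.100   # placeholder; use the broker's real IP

The producer's 'metadata.broker.list' should then point at the same address, e.g. 192.168.1.100:9092.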
3. java.net.NoRouteToHostException: No route to host
Solution: make sure the ZooKeeper IP in your configuration matches the address the ZooKeeper host is actually reachable at.
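For example, on the Kafka broker the ZooKeeper connection string must use a routable address (placeholder IP and default port below):

    # config/server.properties
    zookeeper.connect=192.168.1.100:2181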
4. Fatal error during KafkaServer startup. Prepare to shutdown (kafka.server.KafkaServer) java.net.UnknownHostException: linux-pic4.site:
Solution: add your hostname to /etc/hosts: 127.0.0.1 localhost linux-pic4.site
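To find the name to map, check what the machine calls itself (linux-pic4.site here is the hostname from the error above):

    $ hostname
    linux-pic4.site
    # then in /etc/hosts:
    127.0.0.1 localhost linux-pic4.site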
5. org.apache.spark.SparkException: A master URL must be set in your configuration
Solution: set a master URL when building the SparkConf, e.g. SparkConf sparkConf = new SparkConf().setAppName("JavaDirectKafkaWordCount").setMaster("local");
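A slightly fuller sketch (names are placeholders; for streaming jobs local[2] or local[*] is usually safer than plain local, so the driver has more than one thread):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    // the master URL must be set before the context is created
    SparkConf sparkConf = new SparkConf()
            .setAppName("JavaDirectKafkaWordCount")
            .setMaster("local[2]");   // or spark://host:7077, etc.
    JavaSparkContext sc = new JavaSparkContext(sparkConf);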
6. Failed to locate the winutils binary in the hadoop binary path
Solution: install and configure Hadoop first; on Windows, HADOOP_HOME must point at a directory whose bin folder contains winutils.exe (see item 7 below).
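If installing a full Hadoop is overkill, a commonly used workaround (my assumption, not from the original note) is to point hadoop.home.dir at any folder whose bin subfolder contains winutils.exe, before the Spark context is created:

    // hypothetical path; C:\Hadoop\bin must contain winutils.exe
    System.setProperty("hadoop.home.dir", "C:\\Hadoop");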
7. When starting Spark: Failed to get database default, returning NoSuchObjectException
Solution: 1) Copy winutils.exe from https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin to a folder such as C:\Hadoop\bin, and set HADOOP_HOME to C:\Hadoop. 2) Open an admin command prompt and run C:\Hadoop\bin\winutils.exe chmod 777 /tmp/hive.
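The same steps as console commands (paths as in the note above; setx makes HADOOP_HOME persist across sessions):

    REM run from an admin command prompt
    setx HADOOP_HOME C:\Hadoop
    C:\Hadoop\bin\winutils.exe chmod 777 /tmp/hive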
8. org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true.
Solution: create only one SparkContext per JVM and reuse it (or, as the error message itself suggests, set spark.driver.allowMultipleContexts = true); see the sketch below.
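A minimal sketch of both options (class name is a placeholder; SparkContext.getOrCreate requires Spark 1.5+):

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkContext;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SingleContextExample {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("JavaDirectKafkaWordCount")
                    .setMaster("local[2]");
            // preferred: reuse the JVM's existing context instead of constructing a second one
            JavaSparkContext sc = new JavaSparkContext(SparkContext.getOrCreate(conf));
            // workaround named by the error message itself (use with care):
            // conf.set("spark.driver.allowMultipleContexts", "true");
            sc.stop();
        }
    }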