My machines (Hadoop, Java, Scala, and Hive are already installed):
Master: 192.168.244.129
Slave1.hadoop: 192.168.244.128
Slave2.hadoop: 192.168.244.130
In earlier work, environments such as Hadoop and Hive were all configured correctly.
Starting the Spark cluster with the following command works without problems:
./sbin/start-all.sh
Checking with jps shows:
1322 Jps
1297 Worker
1526 Jps
1452 Master
But then starting spark-shell runs into a problem:
17/07/25 05:48:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/25 05:48:48 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/07/25 05:48:48 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.s
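For context, this `HiveSessionStateBuilder` instantiation error is commonly caused either by Spark falling back to an embedded Derby metastore that is locked or unwritable (a leftover `metastore_db` lock, or launching spark-shell from a directory the user cannot write to), or by Spark not seeing the Hive metastore configuration at all. A hedged sketch of things to check; the `$SPARK_HOME`/`$HIVE_HOME` paths are assumptions about a default layout, not taken from the question:

```shell
# Assumption: Spark fell back to an embedded Derby metastore in the
# current directory. A stale lock file from a previous session can
# break session-state creation; remove it if present.
ls metastore_db/*.lck 2>/dev/null && rm -f metastore_db/*.lck

# If Spark SQL should instead use the existing Hive metastore, make
# Hive's configuration visible to Spark (paths are assumptions):
cp "$HIVE_HOME/conf/hive-site.xml" "$SPARK_HOME/conf/"

# Retry from a directory the current user can write to:
cd /tmp && "$SPARK_HOME/bin/spark-shell"
```

These are environment-specific diagnostic steps, not a guaranteed fix; which branch applies depends on whether the deployment intends to share Hive's metastore or run standalone.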