1. Problem symptom
Launching the spark-sql command fails with the following error:
Error: Failed to load org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver: org/apache/hadoop/hive/cli/CliDriver
Failed to load hive class.
You need to build Spark with -Phive and -Phive-thriftserver.
2. Root cause
This is the cause given in the original English-language post:
This error usually occurs when installing a Spark version without built-in Hadoop libraries (headless version) as the Spark hive and hive thrift server packages are not included.
In short: the error occurs because the installed Spark distribution is the "without Hadoop" (headless) build, so the Spark Hive and Hive Thrift Server packages are not included.
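For reference, the error message itself points at the build profiles involved: per Spark's "Building Spark" documentation, a distribution with Hive and JDBC/Thrift Server support is produced by enabling the `-Phive` and `-Phive-thriftserver` profiles. A sketch of such a build from the Spark source tree (run inside a checked-out Spark source directory; exact profile set depends on your cluster):

```shell
# Build Spark from source with Hive and the Hive Thrift Server included.
# -DskipTests shortens the build; add -Pyarn etc. as your environment requires.
./build/mvn -Phive -Phive-thriftserver -DskipTests clean package
```

Alternatively, downloading one of the regular pre-built packages ("Pre-built for Apache Hadoop x.y") instead of the "without Hadoop" package avoids the problem entirely, since those builds already ship the Hive classes.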
3. Solution
Go into the Spark
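If you want to keep the "without Hadoop" build, Spark's "Using Spark's Hadoop Free Build" documentation describes pointing Spark at a locally installed Hadoop via `SPARK_DIST_CLASSPATH` in `conf/spark-env.sh`. A minimal sketch, assuming the `hadoop` command is on your PATH:

```shell
# conf/spark-env.sh — for a "without Hadoop" (headless) Spark build.
# Let Spark pick up the jars from the local Hadoop installation at runtime.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

Note that this supplies the Hadoop classes (such as the missing org/apache/hadoop/hive/cli/CliDriver dependency chain) but not Spark's own Hive integration jars; if those are absent from `$SPARK_HOME/jars`, a build or download that includes `-Phive` and `-Phive-thriftserver` is still required.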