spark-sql fails to create Hive tables, reporting the error: /user/hive/warehouse is not a directory or unable to create one
2. Solution
Copy $HIVE_HOME/conf/hive-site.xml to $SPARK_HOME/conf/.
In $SPARK_HOME/conf/hive-site.xml, change the value of the hive.metastore.warehouse.dir property from its default, /user/hive/warehouse, to a full HDFS URI.
Restart spark-sql.
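The copy-and-edit steps above can also be scripted. A minimal Python sketch using only the standard library is shown below; the `hive_home`/`spark_home` paths, the helper names, and the HDFS URI are assumptions for illustration, not part of any Spark or Hive API:

```python
# Sketch of steps 1-2: copy hive-site.xml into Spark's conf directory and point
# hive.metastore.warehouse.dir at a full HDFS URI. Paths and the URI below are
# assumptions -- substitute your own installation and namenode address.
import os
import shutil
import xml.etree.ElementTree as ET

def set_property(conf_path, name, value):
    """Set (or append) a <property> in a Hadoop-style configuration file."""
    tree = ET.parse(conf_path)
    root = tree.getroot()  # the <configuration> element
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            prop.find("value").text = value  # overwrite the existing value
            break
    else:  # property not present yet: append a new <property> block
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    tree.write(conf_path)

def copy_and_point_warehouse(hive_home, spark_home, warehouse_uri):
    """Copy hive-site.xml into Spark's conf dir and set the warehouse location."""
    src = os.path.join(hive_home, "conf", "hive-site.xml")
    dst = os.path.join(spark_home, "conf", "hive-site.xml")
    shutil.copy(src, dst)
    set_property(dst, "hive.metastore.warehouse.dir", warehouse_uri)
```

After running this against your installation, restart spark-sql (step 3) so the new setting takes effect.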
The file after modification:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://192.168.2.181:9000/user/hive/warehouse</value>
    <description>hive.metastore.warehouse.dir</description>
  </property>
  <!-- Note: hdfs://192.168.2.181 is my HDFS address; replace it with your own! -->
</configuration>