Once Hive itself is configured, setting up Spark SQL is straightforward: all you need to do is copy a single hive-site.xml file into the spark/conf directory. That copy is the one Spark reads; Hive continues to use its own configuration file unchanged.
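The copy-and-verify steps can be sketched as the shell commands below. This is a minimal sketch, not a definitive procedure: it assumes the HIVE_HOME and SPARK_HOME environment variables point at your Hive and Spark installations, and that the metastore service listens on the thrift://localhost:9083 URI given by hive.metastore.uris in the file below.

```shell
# Copy the Hive config into Spark's conf directory (paths assume
# HIVE_HOME and SPARK_HOME are set for your installation):
cp "$HIVE_HOME/conf/hive-site.xml" "$SPARK_HOME/conf/"

# Start the remote metastore service that hive.metastore.uris
# (thrift://localhost:9083) points at, running it in the background:
hive --service metastore &

# Verify that Spark SQL can reach the metastore by listing databases:
"$SPARK_HOME/bin/spark-sql" -e "SHOW DATABASES;"
```

If the last command prints your Hive databases, Spark SQL is talking to the same metastore as Hive.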
1. Configure hive-site.xml
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
    <description>Thrift URI for the remote metastore. Used by metastore client to
    connect to remote metastore.</description>
  </property>
  <property>
    <name>hive.server2.thrift.min.worker.threads</name>
    <value>5</value>
    <description>Minimum number of Thrift worker threads</description>
  </property>
  <property>
    <name>hive.server2.thrift.max.worker.threads</name>
    <value>500</value>
    <description>Maximum number of Thrift worker threads</description>
  </property>
  <property>
    <name>