Cause:
After submitting a Spark job on the GoldMount big-data cluster (KDE), it kept failing with an org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found error. After researching the issue, the fix is as follows (apply each of the configuration changes below to every node in the cluster):
(1) In hive/conf, edit hive-site.xml:
<property>
<name>hive.aux.jars.path</name>
<value>file:///home/develop/hive-3.0.1/lib/hive-contrib-3.1.0.3.0.1-123.jar</value>
</property>
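As an alternative (or supplement) to setting hive.aux.jars.path, the contrib jar can also be shipped with the job at submit time via spark-submit's --jars flag, which avoids editing cluster-wide config. A minimal sketch, assuming the same jar path as above and a hypothetical job script name:

```shell
# Hypothetical submit command; the jar path matches the
# hive.aux.jars.path value configured above, and your_job.py
# is a placeholder for the actual application.
spark-submit \
  --master yarn \
  --jars /home/develop/hive-3.0.1/lib/hive-contrib-3.1.0.3.0.1-123.jar \
  your_job.py
```

With --jars, the jar is distributed to the driver and executors for that job only, whereas hive.aux.jars.path makes the SerDe available to every Hive/Spark session on the cluster.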
(2) In spark/conf, edit hive-site.xml:
<property>
<name>hive.aux.jars.path</name>
<value>file:///home/dev