1. Problem: when Spark writes data into a Hive table, an HDFS permission error occurs.
2. Cause: the local user running the job does not have write permission on the target HDFS directory.
3. Solution: modify the HDFS configuration file.
Edit etc/hadoop/hdfs-site.xml under the Hadoop installation directory and add the following property:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>
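After editing the file, it is easy to mistype the property name or leave the value unchanged. The sketch below, assuming the standard hdfs-site.xml layout (a `<configuration>` root containing `<property>` elements), parses the file and checks the setting; the inline sample string stands in for the real file path, which depends on your installation.

```python
import xml.etree.ElementTree as ET

# Sample hdfs-site.xml content; in practice, read the file from
# your Hadoop conf directory (exact path is installation-specific).
xml_text = """<configuration>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>"""

root = ET.fromstring(xml_text)
# Collect name -> value pairs from every <property> element.
props = {p.findtext("name"): p.findtext("value") for p in root.iter("property")}
print(props.get("dfs.permissions"))  # -> false
```

To check a live cluster, replace `ET.fromstring(xml_text)` with `ET.parse(path).getroot()` pointed at your actual hdfs-site.xml.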
Then restart the HDFS cluster and the Hive-related services, and rerun the Spark job.
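As a rough sketch of the restart step, the commands below use the stock Hadoop sbin scripts and Hive service launchers; exact commands vary by distribution (CDH, HDP, etc.), so treat these as assumptions.

```shell
# Restart HDFS (scripts live under the Hadoop sbin directory).
stop-dfs.sh
start-dfs.sh

# Restart the Hive metastore and HiveServer2 in the background.
nohup hive --service metastore &
nohup hive --service hiveserver2 &
```

Note that disabling dfs.permissions turns off permission checking for the entire cluster, so it is best suited to development or test environments.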