The error message is as follows (拒绝访问 in the log below is the Windows Chinese-locale message for "Access is denied"):
18/04/27 10:36:33 INFO SessionState: Created local directory: C:/Users/GUOXIA~1.HDS/AppData/Local/Temp/c5c4c419-58fb-4b84-b84e-c6c972255f12_resources
18/04/27 10:36:33 INFO SessionState: Created HDFS directory: /tmp/hive/hdfs/c5c4c419-58fb-4b84-b84e-c6c972255f12
18/04/27 10:36:33 INFO SessionState: Created local directory: C:/Users/GUOXIA~1.HDS/AppData/Local/Temp/guoxiang/c5c4c419-58fb-4b84-b84e-c6c972255f12
18/04/27 10:36:33 INFO SessionState: Created HDFS directory: /tmp/hive/hdfs/c5c4c419-58fb-4b84-b84e-c6c972255f12/_tmp_space.db
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: 拒绝访问。;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
......
Caused by: java.io.IOException: 拒绝访问。
at java.io.WinNTFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2024)
at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
... 34 more
Analysis:
This looks like a permissions problem on the /tmp folder: the process is not allowed to create files under /tmp, so the first thing to try is fixing the /tmp permissions:
Use the winutils tool that ships with Hadoop for Windows to change the tmp permissions:
winutils chmod -R 777 /tmp
winutils chmod -R 777 D:/tmp
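To see why chmod helps here, note that the bottom of the stack trace is `java.io.File.createTempFile` being called by Hive's `SessionState` inside its scratch directory. A minimal standalone probe of that same call pattern (class and method names are mine, not from Hive) can tell you whether a given directory is writable before you launch Spark:

```java
import java.io.File;
import java.io.IOException;

// Diagnostic sketch: reproduce the call that fails inside
// Hive's SessionState -- creating a temp file in the scratch
// directory. If this returns false for your scratch dir, the
// winutils chmod above (or an equivalent ACL fix) is needed.
public class TmpWriteCheck {
    public static boolean canCreateTempFile(String dir) {
        try {
            // Same API the stack trace shows: File.createTempFile
            File probe = File.createTempFile("hive-scratch-", ".probe", new File(dir));
            probe.delete(); // clean up the probe file
            return true;
        } catch (IOException e) {
            // On Windows this is where "Access is denied" surfaces
            return false;
        }
    }

    public static void main(String[] args) {
        // The log above used C:/Users/.../Temp and D:/tmp; use the
        // platform temp dir here so the check runs anywhere.
        String dir = System.getProperty("java.io.tmpdir");
        System.out.println(dir + " writable: " + canCreateTempFile(dir));
    }
}
```

Run it with the actual scratch path from your log (e.g. `D:/tmp`) to confirm the permission fix took effect.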
If that still does not work,
add the following to your code, before the SparkSession is created:
System.setProperty("user.name", "hdfs")
That solved it. As for the reason... I'm not sure (presumably Hive derives the owner of its scratch directories from the user name, so overriding it to hdfs sidesteps the permission mismatch, but I have not verified this).
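A minimal sketch of where that override has to go: it must run before the SparkSession (and therefore the Hive SessionState) is built, since the scratch directories are created at session startup. The Spark calls are shown as comments because they need a Spark runtime and the Spark dependency on the classpath; only the property handling is live code here.

```java
// Sketch: override user.name BEFORE any Spark/Hive initialization.
// Whether Hive actually reads this property for its scratch-dir owner
// is an assumption based on the workaround above, not verified.
public class UserNameOverride {
    public static void main(String[] args) {
        System.setProperty("user.name", "hdfs");

        // With spark-sql and spark-hive on the classpath, this is
        // where the session would be created, AFTER the override:
        // SparkSession spark = SparkSession.builder()
        //         .appName("example")
        //         .enableHiveSupport()
        //         .getOrCreate();

        System.out.println(System.getProperty("user.name")); // hdfs
    }
}
```

If the property is set after `getOrCreate()`, the SessionState has already tried to create its directories under the original Windows user and the IOException is thrown anyway.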