Running a Spark program on Windows, I hit a "tmp/hive" permission error. Trying to grant the permissions with a DOS command only produced a further "ChangeFileModeByMask error", and none of the solutions I found online actually worked. After a lot of trial and error I finally fixed it, so I'm sharing the experience here.
I was running a Spark offline job on Windows 10 when it threw the following exception:
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwx---;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:
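For reference, the fix that is commonly reported for this error is to change the permissions on the scratch directory with a `winutils.exe` that matches your Hadoop build; the "ChangeFileModeByMask error" usually indicates a mismatched or incomplete winutils. The sketch below assumes an install path of `C:\hadoop`; adjust the paths to your own machine:

```shell
:: Point HADOOP_HOME at a directory whose bin\ contains winutils.exe
:: (and hadoop.dll) built for the same Hadoop version Spark uses.
:: C:\hadoop is an assumed path, not anything from the original post.
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

:: Grant full permissions on the scratch dir, relative to the drive
:: you launch Spark from (e.g. C:\tmp\hive if you run from C:).
%HADOOP_HOME%\bin\winutils.exe chmod -R 777 \tmp\hive

:: Verify that the permissions now show rwxrwxrwx.
%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
```

If `chmod` still fails with "ChangeFileModeByMask error", the usual cause is a winutils.exe downloaded for a different Hadoop version than the one bundled with your Spark distribution.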