Resolved: Permission denied at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)

Starting the Hive CLI as myuser fails with Permission denied at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522):

Logging initialized using configuration in jar:file:/opt/hive/1.2.1/lib/hive-common-1.2.1.jar!/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Permission denied
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.RuntimeException: java.io.IOException: Permission denied
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
    ... 7 more
Caused by: java.io.IOException: Permission denied
    at java.io.UnixFileSystem.createFileExclusively(Native Method)
    at java.io.File.createNewFile(File.java:1006)
    at java.io.File.createTempFile(File.java:1989)
    at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
    ... 7 more
myuser@namenode:/opt/hive/current/bin$
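
The telling frame is java.io.UnixFileSystem.createFileExclusively: the temp file that SessionState fails to create is on the local filesystem, not on HDFS. So besides the HDFS scratch dir checked in Attempt 2 below, the local scratch dir is worth a look. A quick sanity check (the /tmp/hive path is only the common default, an assumption rather than something read from this cluster's hive-site.xml):

# local scratch dir (hive.exec.local.scratchdir); /tmp/hive is an assumed default
ls -ld /tmp/hive
# HDFS scratch dir (hive.exec.scratchdir) as configured on this cluster
hdfs dfs -ls -d /hivetest/hive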

Attempt 1: run Hive with sudo

myuser@namenode:~$ sudo hive
sudo: hive: command not found

# You have to be in the directory that contains the executable, and after sudo the environment variables still have to be set up explicitly
myuser@namenode:/opt/hive/current/bin$ sudo source /etc/profile ; ./hive
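
As written, though, that line does not do what the comment intends: sudo resets the environment (which is why plain sudo hive was not found), source is a shell builtin that sudo cannot run on its own, and the semicolon means ./hive still runs as myuser. A form that actually loads the profile and launches Hive as root would look more like the sketch below; it was not the fix that ultimately worked here:

sudo bash -c 'source /etc/profile && /opt/hive/current/bin/hive'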

Attempt 2: loosen the scratch directory permissions

Change the directory configured by the Hive parameter hive.exec.scratchdir (usually /tmp/hive) to permission 777, i.e.: chmod -R 777 /tmp/hive. On this cluster the property is:

<property>
  <name>hive.exec.scratchdir</name>
  <value>/hivetest/hive</value>
  <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
</property>
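
Since this scratch dir lives on HDFS, the 777 change would be made with hdfs dfs rather than a local chmod; a sketch of the adjustment (which, as shown next, was not actually needed here):

hdfs dfs -chmod -R 777 /hivetest/hive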

It turned out the directory was already world-writable:

myuser@namenode:/opt/hive/current/bin$ hdfs dfs -ls /hivetest/
17/04/26 15:17:09 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxrwxrwx - 777 hive 0 2017-01-10 15:44 /hivetest/hive

Attempt 3: add the user to the hive group

usermod -a -G hive myuser

# Verify the change
myuser@namenode:/opt/hive/current/bin$ groups
myuser adm cdrom sudo dip plugdev lpadmin sambashare
myuser@namenode:/opt/hive/current/bin$ groups myuser
myuser : myuser adm cdrom sudo dip plugdev lpadmin sambashare hive

Still the same error as before. (A running shell session does not pick up a newly added group until you log out and back in, which is why plain groups above still lacks hive while groups myuser already lists it.)
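
One way to make the new group take effect without a full re-login, which this session did not try, is to open a subshell under the hive group and launch the CLI from there:

# start a subshell whose current group is hive (requires the usermod above)
newgrp hive
./hive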

Attempt 4: switch to the hive user

myuser@namenode:/opt/hive/current/bin$ su hive
# the hive account's password, 123456 on this box
Password:
hive@namenode:/opt/hive/current/bin$

Solved: launching ./hive as the hive user works without the permission error.
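
For convenience, the same thing can be done in one line with sudo -u, assuming myuser's sudo rights allow running commands as hive (an assumption, not shown in the original session):

# run the Hive CLI directly as the hive user (assumes sudoers permits it)
sudo -u hive /opt/hive/current/bin/hive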
