Hadoop Cluster Configuration Issues


  (2013-06-11 15:58:48)
  Category: Hadoop/Spark

1. Linux: jps command not found

After starting Hadoop, running jps reports that the command cannot be found:

$ jps
-bash: jps: command not found

jps is an executable in the JDK's bin directory. Checking my JDK directory, the jps executable is there; it just is not on the PATH (you can inspect the PATH with echo $PATH).

So it has to be added manually: as root, run vi /etc/profile and append the line export PATH="/usr/java/jdk160_05/bin:$PATH", where the directory part is wherever you installed the JDK, including the JDK folder name. Save and exit.

Then run source /etc/profile; if no error is reported, the change took effect, and running jps again now works.
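The steps above can be collected into one session. This is a sketch; the JDK path /usr/java/jdk160_05 is taken from this article and will differ on your machine:

```shell
# Confirm jps actually exists under the JDK (adjust the path to your install)
ls /usr/java/jdk160_05/bin/jps

# Append the JDK bin directory to PATH system-wide (run as root)
echo 'export PATH="/usr/java/jdk160_05/bin:$PATH"' >> /etc/profile

# Reload the profile in the current shell, then verify
source /etc/profile
jps
```

Note that source /etc/profile only affects the current shell; other already-open sessions need to re-source it or be restarted.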

2. Misconfigured environment variable makes basic commands fail

$ ls
-bash: ls: command not found

Solution: restore a sane PATH in the current shell,

export PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin

then go back and fix the broken entry in /etc/profile.
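When PATH is broken badly enough that even ls fails, shell builtins and absolute paths still work, which is what makes recovery possible. A sketch, assuming the stock PATH value shown above:

```shell
# export is a shell builtin, so it works even with an empty PATH
export PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin

# Until PATH is restored, external commands can also be run by absolute path:
/bin/ls /etc/profile

# Finally, edit /etc/profile to remove the bad assignment and reload it
source /etc/profile
```

A common cause of this error is writing export PATH=/some/dir instead of export PATH=/some/dir:$PATH, which throws away all the standard directories.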

3. Hadoop cluster setup: Permission denied

[grid@HD1 bin]$ start-all.sh
Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-namenode-HD1.out
/opt/hadoop-1.2.0/bin/hadoop-daemon.sh: line 138: /tmp/hadoop-grid-namenode.pid: Permission denied
192.168.100.37: starting datanode, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-datanode-HD3.out
192.168.100.36: starting datanode, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-datanode-HD2.out
192.168.100.36: /opt/hadoop-1.2.0/bin/hadoop-daemon.sh: line 138: /tmp/hadoop-grid-datanode.pid: Permission denied
192.168.100.35: starting secondarynamenode, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-secondarynamenode-HD1.out
192.168.100.35: /opt/hadoop-1.2.0/bin/hadoop-daemon.sh: line 138: /tmp/hadoop-grid-secondarynamenode.pid: Permission denied
starting jobtracker, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-jobtracker-HD1.out
/opt/hadoop-1.2.0/bin/hadoop-daemon.sh: line 138: /tmp/hadoop-grid-jobtracker.pid: Permission denied
192.168.100.36: starting tasktracker, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-tasktracker-HD2.out
192.168.100.36: /opt/hadoop-1.2.0/bin/hadoop-daemon.sh: line 138: /tmp/hadoop-grid-tasktracker.pid: Permission denied
192.168.100.37: starting tasktracker, logging to /opt/hadoop-1.2.0/libexec/../logs/hadoop-grid-tasktracker-HD3.ou
Fix (either approach changes where the pid files are written, away from /tmp):
1) In the Hadoop config directory, edit hadoop-env.sh and add: export HADOOP_PID_DIR=$HADOOP_HOME/run/tmp. Apply the change on all three machines.
2) Or edit /etc/profile and add the same line: export HADOOP_PID_DIR=$HADOOP_HOME/run/tmp. Again, change all three machines.
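A sketch of option 1 on a single node, to be repeated on all three machines. It assumes $HADOOP_HOME is set and that the daemons run as the grid user from the log output above:

```shell
# Create the new pid directory, owned by the user that runs the daemons
mkdir -p "$HADOOP_HOME/run/tmp"
chown grid:grid "$HADOOP_HOME/run/tmp"

# Point the daemons at it (Hadoop 1.x keeps hadoop-env.sh under conf/)
echo 'export HADOOP_PID_DIR=$HADOOP_HOME/run/tmp' >> "$HADOOP_HOME/conf/hadoop-env.sh"

# Restart so the new pid location takes effect
stop-all.sh
start-all.sh
```

The original error occurs because /tmp pid files left behind by a previous run (possibly created by a different user, such as root) cannot be overwritten by the grid user; moving HADOOP_PID_DIR to a directory grid owns avoids the conflict.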
