Spark 1.3 + Ubuntu 14.10 + Hadoop 2.6 (single-node)

Single-node setup
Creating the hadoop group and user on Ubuntu
The hadoop administrator should preferably be the same user who will later log in to the desktop environment and run Eclipse; otherwise you will hit permission-denied problems later. They can be worked around, but it is simpler to avoid them.

1. Create the hadoop user group:

sudo addgroup hadoop

2. Create the hadoop user in that group:

sudo adduser --ingroup hadoop hadoop

3. Grant the hadoop user sudo privileges by editing /etc/sudoers (using sudo visudo is safer, since it syntax-checks the file before saving):

sudo gedit /etc/sudoers


Below the line root ALL=(ALL:ALL) ALL, add hadoop ALL=(ALL:ALL) ALL.
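After the edit, the privilege section of /etc/sudoers would look roughly like this (a sketch; your file will contain other entries as well):

```text
# User privilege specification
root    ALL=(ALL:ALL) ALL
hadoop  ALL=(ALL:ALL) ALL
```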

Installing the JDK on Ubuntu

For details see: http://blog.csdn.net/ggz631047367/article/details/42366687 (in this guide, JAVA_HOME=/usr/lib/jvm/jdk1.8.0_25)

Installing the SSH service
sudo apt-get install ssh openssh-server


Setting up passwordless SSH login to this machine

Switch to the hadoop user and run:
su - hadoop

ssh-keygen can generate keys with either RSA or DSA; RSA is the default.

1. Create an SSH key pair, here using RSA:
ssh-keygen -t rsa -P ""   (after you press Enter, two files are generated under ~/.ssh/: id_rsa and id_rsa.pub, which form a key pair)

2. Go into ~/.ssh/ and append id_rsa.pub to the authorized_keys file (authorized_keys does not exist at first; the append creates it):
cd ~/.ssh
cat id_rsa.pub >> authorized_keys   (after this you can log in to this machine without a password)

3. Log in to localhost:
ssh localhost

4. Log out:
exit
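The append-and-create behaviour in step 2 can be rehearsed in a throwaway directory without touching your real ~/.ssh; the key material below is a fake placeholder, purely for illustration:

```shell
# Demo of the authorized_keys pattern in a temp dir (fake key, for illustration).
tmp=$(mktemp -d)
echo 'ssh-rsa AAAAB3...placeholder... hadoop@localhost' > "$tmp/id_rsa.pub"
cat "$tmp/id_rsa.pub" >> "$tmp/authorized_keys"   # >> creates the file if it is absent
chmod 600 "$tmp/authorized_keys"                  # sshd's StrictModes rejects lax permissions
stat -c '%a' "$tmp/authorized_keys"               # prints: 600
```

On the real machine the same pattern runs against ~/.ssh/id_rsa.pub and ~/.ssh/authorized_keys.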

Installing Hadoop (download the prebuilt tarball; there is no need to build from source)

Download: http://apache.fayea.com/hadoop/common/stable/hadoop-2.6.0.tar.gz

1. Extract hadoop into /usr/local:
sudo tar -zxvf hadoop-2.6.0.tar.gz
sudo mv hadoop-2.6.0 /usr/local/hadoop
sudo chmod -R 775 /usr/local/hadoop
sudo chown -R hadoop:hadoop /usr/local/hadoop   # otherwise access over ssh will be denied

2. Configuration

Edit the hadoop user's .bashrc:
gedit ~/.bashrc

Append the following to the end of the file:

#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_25
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END


If you do not know your JAVA_HOME, you can find it with:

update-alternatives --config java

Take the path only up to the JDK's root directory (i.e. strip the trailing /bin/java).
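For example, if update-alternatives reports the binary /usr/lib/jvm/jdk1.8.0_25/bin/java, stripping the trailing /bin/java gives the JDK root; the path arithmetic can be sketched as:

```shell
# Derive JAVA_HOME from the full path of the java binary (example path).
java_bin=/usr/lib/jvm/jdk1.8.0_25/bin/java
java_home=$(dirname "$(dirname "$java_bin")")   # strip /java, then /bin
echo "$java_home"                               # prints: /usr/lib/jvm/jdk1.8.0_25
```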


Run the following to make the changes take effect:
source ~/.bashrc

Edit hadoop-env.sh:

sudo gedit /usr/local/hadoop/etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/local/jvm/jdk1.7.0_79

Find the JAVA_HOME line and change it to the value above (use the path of the JDK you actually installed).
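If you prefer a non-interactive edit over gedit, a sed one-liner can rewrite the JAVA_HOME line; here it is rehearsed on a temporary file standing in for hadoop-env.sh (adjust the JDK path to your own):

```shell
# Rewrite the JAVA_HOME line in a stand-in for hadoop-env.sh (sketch).
f=$(mktemp)
echo 'export JAVA_HOME=${JAVA_HOME}' > "$f"   # the stock placeholder line
sed -i 's|^export JAVA_HOME=.*|export JAVA_HOME=/usr/local/jvm/jdk1.7.0_79|' "$f"
grep '^export JAVA_HOME' "$f"                 # prints the rewritten line
```

On the real file, point sed at /usr/local/hadoop/etc/hadoop/hadoop-env.sh (with sudo).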
Configure the system-wide environment variables:
sudo vi /etc/profile
Add the hadoop environment variables (adjust the paths to your own installation; the lines below also assume JRE_HOME, ANT_HOME, and HADOOP_HOME are defined elsewhere in your profile):
export JAVA_HOME=/usr/local/jvm/jdk1.7.0_79
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib:$HADOOP_HOME/lib
export PATH=${JAVA_HOME}/bin:$PATH:$ANT_HOME/bin:$HADOOP_HOME/bin

Check that the installation succeeded:
hadoop version
If you see output like the following, the installation worked:
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar

Testing

Verify the installation by running WordCount, one of Hadoop's bundled examples.

Create an input folder under /usr/local/hadoop:
mkdir input
cp README.txt input

Run WordCount from the hadoop directory (use the examples jar, not the -sources jar, which contains only source files and nothing runnable):
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount input output
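What WordCount computes can be sanity-checked with a plain-shell equivalent on a tiny sample; the output format (word, tab, count, sorted by word) matches what ends up in output/part-r-00000:

```shell
# A plain-shell word count, for comparison with the MapReduce result.
result=$(printf 'hello world\nhello hadoop\n' \
  | tr -s ' ' '\n' | sort | uniq -c \
  | awk '{print $2"\t"$1}')
printf '%s\n' "$result"
# hadoop  1
# hello   2
# world   1
```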
----------------------------------------------------------------------
master@master-VirtualBox:~$ sudo cp /mnt/shared/hadoop-2.6.0.tar.gz /usr/local/
master@master-VirtualBox:~$ cd /usr/local/
master@master-VirtualBox:/usr/local$ sudo tar zxf hadoop-2.6.0.tar.gz
master@master-VirtualBox:/usr/local$ cd hadoop-2.6.0/

master@master-VirtualBox:/usr/local/hadoop-2.6.0$ cd etc/hadoop/

master@master-VirtualBox:/usr/local/hadoop-2.6.0/etc/hadoop$ sudo gedit hadoop-env.sh
export JAVA_HOME=/usr/local/jvm/jdk1.7.0_79
master@master-VirtualBox:/usr/local/hadoop-2.6.0/etc/hadoop$ cd ~
master@master-VirtualBox:~$ sudo gedit .bashrc
(add /usr/local/hadoop-2.6.0/bin to PATH in .bashrc)
master@master-VirtualBox:~$ source .bashrc
master@master-VirtualBox:~$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /usr/local/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
master@master-VirtualBox:/usr/local/hadoop-2.6.0$ sudo mkdir input
master@master-VirtualBox:/usr/local/hadoop-2.6.0$ sudo cp README.txt input/
master@master-VirtualBox:/usr/local/hadoop-2.6.0$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount input output
If the job fails with an error, the permissions on the hadoop directory are insufficient; relax them and retry:
master@master-VirtualBox:/usr/local/hadoop-2.6.0$ cd /usr/local
master@master-VirtualBox:/usr/local$ sudo chmod -R 777 hadoop-2.6.0
master@master-VirtualBox:/usr/local$ cd hadoop-2.6.0/
master@master-VirtualBox:/usr/local/hadoop-2.6.0$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount input output
master@master-VirtualBox:/usr/local/hadoop-2.6.0$ cat output/*