Hadoop Study Notes 02 - Installation

Run the following commands as the root user.

Disable the firewall (stop it now, and keep it from starting at boot):
service iptables stop
chkconfig iptables off

(These commands apply to CentOS 6; on CentOS 7 and later, the equivalents are systemctl stop firewalld and systemctl disable firewalld.)

Check the current firewall status:

service iptables status

Configure the hostname: vim /etc/sysconfig/network

(In vim, press d followed by End to delete to the end of the line; dd deletes the whole line.)

HOSTNAME=hadoop01

Reload the file with: source /etc/sysconfig/network
This file-based setting only fully takes effect after a reboot, so also run hostname hadoop01 to change the hostname immediately; note that this command alone does not survive a reboot.
You can check the current hostname by running hostname with no arguments.

Configure the hosts file: vim /etc/hosts

To save and exit, press Esc, then type :wq and press Enter.
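The hosts file should map every node name used later in the configuration (hadoop01, hadoop02, hadoop03) to an IP address. A sketch with placeholder addresses, to be replaced with your machines' real IPs:

```
192.168.1.101   hadoop01
192.168.1.102   hadoop02
192.168.1.103   hadoop03
```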

Set up passwordless SSH login:
On the hadoop01 node, run ssh-keygen and press Enter at every prompt to generate the public/private key pair.

The public key can then be copied to a given host: ssh-copy-id -i ~/.ssh/id_rsa.pub root@hadoop01 (or any other host).

If this command fails with /usr/bin/ssh-copy-id: ERROR: No identities found,
regenerate the key pair with ssh-keygen -t dsa, then copy the newly generated public key:
ssh-copy-id -i ~/.ssh/id_dsa.pub root@hadoop01

Test whether passwordless login works: ssh root@hadoop01
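The key-generation step above can also be done without pressing Enter at each prompt. A minimal sketch, assuming OpenSSH is installed and the default key path is used:

```shell
# Generate an RSA key pair non-interactively:
#   -N ""  sets an empty passphrase (required for passwordless login)
#   -f     writes to the default location; -q suppresses chatter
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
[ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -t rsa -N "" -f "$HOME/.ssh/id_rsa" -q
ls "$HOME/.ssh/id_rsa.pub"
```

This leaves ~/.ssh/id_rsa.pub ready for ssh-copy-id.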

If no password is requested, the login succeeded.

To end the session, simply run logout.

Check the Java version: java -version
The environment variables can be configured in /etc/profile:
#set java env
JAVA_HOME=/usr/local/src/java/jdk1.7.0_51
JAVA_BIN=/usr/local/src/java/jdk1.7.0_51/bin
PATH=$JAVA_HOME/bin:$PATH
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export JAVA_HOME JAVA_BIN PATH CLASSPATH
If java -version reports an error, check that the configuration above is correct and that Java is actually installed at the JAVA_HOME path shown.
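Before editing /etc/profile, it can help to confirm the JDK really is at the assumed path. A small check, using the same directory as the JAVA_HOME above (adjust if your extract path differs):

```shell
# Verify the java binary exists where JAVA_HOME will point
JAVA_HOME=/usr/local/src/java/jdk1.7.0_51
if [ -x "$JAVA_HOME/bin/java" ]; then
    echo "java found under $JAVA_HOME"
    "$JAVA_HOME/bin/java" -version
else
    echo "java not found under $JAVA_HOME - check the extract path"
fi
```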

First switch to the /usr/local/src/ directory (cd /usr/local/src/), create a Java directory with mkdir java, and upload the JDK archive (the Linux build) into it.
Then run: tar -zxvf jdk-7u51-linux-x64.tar.gz


Then run java -version again.

To uninstall a JDK that was installed via RPM, run: rpm -e jdk1.7.0_51

Upload the Hadoop archive: run cd /usr/local/src/, create a directory with mkdir hadoop, switch into it with cd hadoop, and upload the archive there.
Run: tar -zxvf hadoop-2.7.6.tar.gz

To find the Hadoop configuration files, change into the etc/hadoop directory inside the extracted folder:

/usr/local/src/hadoop/hadoop-2.7.1/etc/hadoop

There is also the sbin directory (which holds the cluster start/stop scripts).

Two configuration files need to be modified. First, core-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop01:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/src/hadoop/hadoop-2.7.1/tmp</value>
    </property>
    <property>
        <name>ha.zookeeper.quorum</name>
        <value>hadoop01:2181,hadoop02:2181,hadoop03:2181</value>
    </property>

</configuration>
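hadoop.tmp.dir (set above) is where Hadoop keeps its working data, and the tmp directory does not exist after extracting the archive; it does no harm to create it up front so the path in the config actually exists. The path below matches the value in core-site.xml:

```shell
# Create the directory referenced by hadoop.tmp.dir in core-site.xml
mkdir -p /usr/local/src/hadoop/hadoop-2.7.1/tmp
ls -ld /usr/local/src/hadoop/hadoop-2.7.1/tmp
```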

Then hdfs-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>