【MapReduce】Installing Single-Node Hadoop on a Mac

Notes:

JDK version installed: jdk1.8.0_231

Hadoop version installed: 3.2.1

 

1. Install the JDK

https://www.jianshu.com/p/a85658902f26

Note: when adding the export line, change the path to match your own JDK version. Mine is jdk1.8.0_231.jdk, not the jdk1.8.0_211.jdk used in the linked article.

 

2. Install Homebrew and Homebrew-cask

https://juejin.im/post/5cd2a50e518825356d54b847

 

3. Configure SSH to localhost

 

  • Open System Preferences → Sharing → Remote Login (off by default; just turn Remote Login on);
  • Generate a key pair (this creates id_rsa and id_rsa.pub in the .ssh folder of your home directory):
    ssh-keygen -t rsa

     

  • Run the following command to append the public key to authorized_keys, so that later logins need no password:
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

     

  • Use the command below to test that you can now log in without a password:
    ssh localhost
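The key-setup steps above can be sketched as one small script. This is only a sketch pointed at a scratch directory so it can be tried safely; the real setup uses ~/.ssh instead of the temporary SSH_DIR used here.

```shell
# Scratch directory stands in for ~/.ssh so the sketch is safe to run
SSH_DIR=$(mktemp -d)

# Generate an RSA key pair non-interactively (-N '' = empty passphrase, -q = quiet)
ssh-keygen -t rsa -N '' -f "$SSH_DIR/id_rsa" -q

# Append the public key to authorized_keys to allow password-free login
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"
```

The chmod matters: sshd refuses to honor an authorized_keys file that is writable by anyone other than its owner.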

 

4. Install Hadoop

brew install hadoop

 

5. Configure the Hadoop environment

https://www.cnblogs.com/little-YTMM/p/4394981.html

In the directory /usr/local/Cellar/hadoop/3.2.1/libexec/etc/hadoop, open hadoop-env.sh and find the line below. (If you are not sure how to get to that directory: open Finder, choose "Go" → "Go to Folder…" from the menu bar, and paste the path in.)

  • export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true"

    Replace it with:

    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc="

     

  • In /usr/local/Cellar/hadoop/3.2.1/libexec/etc/hadoop, find core-site.xml and insert the following:
    <configuration>
      <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
        <description>A base for other temporary directories.</description>
      </property>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
      </property>
    </configuration>
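As a side note, fs.default.name is the deprecated alias of fs.defaultFS; it still works on 3.2.1, but the same setting is more commonly written as:

```xml
<property>
  <!-- modern name for the default filesystem URI; equivalent to fs.default.name -->
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
```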
  • In /usr/local/Cellar/hadoop/3.2.1/libexec/etc/hadoop, find mapred-site.xml and add:
    <configuration>
      <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9010</value>
      </property>
    </configuration>
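For what it's worth, mapred.job.tracker is a Hadoop 1.x JobTracker property that Hadoop 3 ignores; to actually run MapReduce jobs on YARN, the property you would normally put in mapred-site.xml is:

```xml
<property>
  <!-- submit MapReduce jobs to YARN rather than the old JobTracker -->
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
```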

     

  • In /usr/local/Cellar/hadoop/3.2.1/libexec/etc/hadoop, find hdfs-site.xml and add (replication is set to 1 because this is a single-node setup):
    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
    </configuration>
  • Before starting the daemons, the newly installed HDFS must be formatted; this creates the storage directories and initializes the metadata of an empty file system. Run:
    hdfs namenode -format

     

However, I ran into an 【Error】:

WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be incomplete.

ERROR: JAVA_HOME /Library/Java/JavaVirtualMachines/jdk1.8.0_211.jdk/Contents/Home does not exist.

【Analysis】

(1) The ERROR occurred because, when configuring the JDK at the start of this article, I wrote the export line as

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_211.jdk/Contents/Home

while my actual JDK version is jdk1.8.0_231.jdk.

So open the file and correct the path:

open .bash_profile

Then reload the profile and verify:

source .bash_profile
echo $JAVA_HOME
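Incidentally, you can sidestep version-specific paths like this entirely: macOS ships a /usr/libexec/java_home helper, so a .bash_profile line such as the one below (a sketch, assuming a 1.8 JDK is installed) resolves the right directory regardless of the exact update number:

```shell
# In ~/.bash_profile: resolve JAVA_HOME dynamically instead of
# hard-coding a versioned directory like jdk1.8.0_231.jdk
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
```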

(2) The WARNING comes from an incorrect HADOOP_CONF_DIR path.

Open hadoop-env.sh and look for HADOOP_CONF_DIR. My file had it set to {HADOOP_HOME}/etc/hadoop, so let's probe the directory tree to see where that goes wrong:

(base) wangwenzhuodeMacBook-Air:~ wangwenzhuo$ ls ${HADOOP_HOME}/etc/hadoop
ls: /usr/local/Cellar/hadoop/3.2.1/etc/hadoop: No such file or directory
(base) wangwenzhuodeMacBook-Air:~ wangwenzhuo$ ls ${HADOOP_HOME}/etc
ls: /usr/local/Cellar/hadoop/3.2.1/etc: No such file or directory
(base) wangwenzhuodeMacBook-Air:~ wangwenzhuo$ ls $HADOOP_HOME/etc
ls: /usr/local/Cellar/hadoop/3.2.1/etc: No such file or directory
(base) wangwenzhuodeMacBook-Air:~ wangwenzhuo$ ls $HADOOP_HOME
INSTALL_RECEIPT.json	NOTICE.txt		bin			sbin
LICENSE.txt		README.txt		libexec
(base) wangwenzhuodeMacBook-Air:~ wangwenzhuo$ ls $HADOOP_HOME/libexec
bin			hadoop-functions.sh	mapred-config.sh	yarn-config.sh
etc			hdfs-config.sh		sbin
hadoop-config.sh	libexec			share
(base) wangwenzhuodeMacBook-Air:~ wangwenzhuo$ ls $HADOOP_HOME/libexec/etc
hadoop

So change it to {HADOOP_HOME}/libexec/etc/hadoop: under Homebrew, the whole Hadoop tree lives inside libexec.
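The directory probe above can be sketched as a tiny script. A scratch directory stands in for /usr/local/Cellar/hadoop/3.2.1, pre-populated with the libexec layout that Homebrew actually creates:

```shell
# Scratch stand-in for /usr/local/Cellar/hadoop/3.2.1
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HADOOP_HOME/libexec/etc/hadoop"   # the layout Homebrew creates

# Try each candidate config path and keep the one that exists
HADOOP_CONF_DIR=""
for candidate in "$HADOOP_HOME/etc/hadoop" "$HADOOP_HOME/libexec/etc/hadoop"; do
  if [ -d "$candidate" ]; then
    HADOOP_CONF_DIR="$candidate"
  fi
done
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
```

Only the libexec candidate survives the -d test, which matches what the ls session above showed.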

 

6. Start the daemons: in /usr/local/Cellar/hadoop/3.2.1/sbin, run the following commands

https://blog.csdn.net/jxq0816/article/details/78736449  (see the second half of that article)

start-dfs.sh   # start HDFS
start-yarn.sh  # start YARN

【Error】

-bash: start-dfs.sh: command not found

【Solution】Change into the sbin directory and run the scripts from there with the ./ prefix:

./start-dfs.sh
./start-yarn.sh
(base) wangwenzhuodeMacBook-Air:hadoop wangwenzhuo$ cd /usr/local/Cellar/hadoop/3.2.1
(base) wangwenzhuodeMacBook-Air:3.2.1 wangwenzhuo$ ls 
INSTALL_RECEIPT.json	README.txt		sbin
LICENSE.txt		bin
NOTICE.txt		libexec
(base) wangwenzhuodeMacBook-Air:3.2.1 wangwenzhuo$ cd sbin
(base) wangwenzhuodeMacBook-Air:sbin wangwenzhuo$ ls
FederationStateStore	refresh-namenodes.sh	stop-balancer.sh
distribute-exclude.sh	start-all.sh		stop-dfs.sh
hadoop-daemon.sh	start-balancer.sh	stop-secure-dns.sh
hadoop-daemons.sh	start-dfs.sh		stop-yarn.sh
httpfs.sh		start-secure-dns.sh	workers.sh
kms.sh			start-yarn.sh		yarn-daemon.sh
mr-jobhistory-daemon.sh	stop-all.sh		yarn-daemons.sh
(base) wangwenzhuodeMacBook-Air:sbin wangwenzhuo$ ./start-dfs.sh
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [wangwenzhuodeMacBook-Air.local]
wangwenzhuodeMacBook-Air.local: Warning: Permanently added 'wangwenzhuodemacbook-air.local,172.27.130.220' (ECDSA) to the list of known hosts.
2019-10-21 10:29:06,840 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
(base) wangwenzhuodeMacBook-Air:sbin wangwenzhuo$ ./start-yarn.sh
Starting resourcemanager
Starting nodemanagers
(base) wangwenzhuodeMacBook-Air:sbin wangwenzhuo$ 
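Alternatively, the "command not found" error disappears if Hadoop's sbin directory is added to PATH; a sketch, assuming Homebrew's default Cellar location for 3.2.1:

```shell
# Put Hadoop's sbin directory on PATH so the start scripts resolve
# without the ./ prefix
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.2.1
export PATH="$PATH:$HADOOP_HOME/sbin"
echo "$PATH"
```

With this in ~/.bash_profile, plain `start-dfs.sh` and `start-yarn.sh` work from any directory.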

7. Verify success: open http://localhost:9870 in a browser to reach the HDFS NameNode page. (In Hadoop 3.x this UI moved from the old 50070 port to 9870, so if 50070 shows nothing, try 9870.)

 http://localhost:8088 shows the YARN ResourceManager page.
