Compiling Hadoop 3.2.1 on Windows 10

Overview

Hadoop is the core component of the big data stack. Because it is open source and its codebase is large and sprawling, you will inevitably run into pitfalls, and solving a problem often means digging into the source; if there is a bug, you may even have to patch it and rebuild. So it is worth getting a Windows build of Hadoop working. Hadoop is not like a typical small Java project where one Maven command turns the source into a jar: it pulls in many dependencies, which makes building it on Windows comparatively painful.

The Hadoop source root contains a BUILDING.txt file with build instructions for each platform. The Windows instructions are the following section:

Following these instructions will save you many detours, but they do not guarantee a clean build on the first try; you will certainly hit problems. Don't panic: read the error message and look for a fix. As the saying goes, meet the enemy as it comes; learning how to approach problems matters more than any single solution.

----------------------------------------------------------------------------------

Building on Windows

----------------------------------------------------------------------------------
Requirements:

* Windows System
* JDK 1.8
* Maven 3.0 or later
* ProtocolBuffer 2.5.0
* CMake 3.1 or newer
* Visual Studio 2010 Professional or Higher
* Windows SDK 8.1 (if building CPU rate control for the container executor)
* zlib headers (if building native code bindings for zlib)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.
* Python ( for generation of docs using 'mvn site')

Unix command-line tools are also included with the Windows Git package which
can be downloaded from http://git-scm.com/downloads

If using Visual Studio, it must be Professional level or higher.
Do not use Visual Studio Express.  It does not support compiling for 64-bit,
which is problematic if running a 64-bit system.

The Windows SDK 8.1 is available to download at:

http://msdn.microsoft.com/en-us/windows/bg162891.aspx

Cygwin is not required.

----------------------------------------------------------------------------------
Building:

Keep the source code tree in a short path to avoid running into problems related
to Windows maximum path length limitation (for example, C:\hdc).

There is one support command file located in dev-support called win-paths-eg.cmd.
It should be copied somewhere convenient and modified to fit your needs.

win-paths-eg.cmd sets up the environment for use. You will need to modify this
file. It will put all of the required components in the command path,
configure the bit-ness of the build, and set several optional components.

Several tests require that the user must have the Create Symbolic Links
privilege.

All Maven goals are the same as described above with the exception that
native code is built by enabling the 'native-win' Maven profile. -Pnative-win
is enabled by default when building on Windows since the native components
are required (not optional) on Windows.

If native code bindings for zlib are required, then the zlib headers must be
deployed on the build machine. Set the ZLIB_HOME environment variable to the
directory containing the headers.

set ZLIB_HOME=C:\zlib-1.2.7

At runtime, zlib1.dll must be accessible on the PATH. Hadoop has been tested
with zlib 1.2.7, built using Visual Studio 2010 out of contrib\vstudio\vc10 in
the zlib 1.2.7 source tree.

http://www.zlib.net/

----------------------------------------------------------------------------------
Building distributions:

 * Build distribution with native code    : mvn package [-Pdist][-Pdocs][-Psrc][-Dtar][-Dmaven.javadoc.skip=true]

Requirements

  • JDK 1.8
  • Maven 3.0 or later
  • ProtocolBuffer 2.5.0
  • CMake 3.1 or newer
  • Visual Studio 2010 Professional or Higher
  • Windows SDK 8.1 (if building CPU rate control for the container executor)
  • zlib headers (if building native code bindings for zlib)
  • Internet connection for first build (to fetch all Maven and Hadoop dependencies)
  • Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These tools must be present on your PATH (the ones bundled with Git also work)
  • Python (for generation of docs using 'mvn site')
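Before building, every tool above has to be reachable from the shell that runs Maven. Below is a minimal sketch of a win-paths-eg.cmd-style environment script in the spirit of the one BUILDING.txt describes; all install paths are assumptions for illustration and must be adapted to your machine:

```bat
:: Sketch of a win-paths-eg.cmd-style environment script (all paths are examples).
:: Run it from the x64 Native Tools Command Prompt of Visual Studio.
set JAVA_HOME=C:\Java\jdk1.8.0_221
set MAVEN_HOME=C:\apache-maven-3.6.3
set ZLIB_HOME=C:\zlib-1.2.7

:: Tell the native build to target 64-bit.
set Platform=x64

:: protoc.exe, CMake, and the GnuWin32/Git Unix tools must all be on PATH.
set PATH=%JAVA_HOME%\bin;%MAVEN_HOME%\bin;C:\protoc;C:\Program Files\CMake\bin;C:\Program Files\Git\usr\bin;%PATH%
```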

Environment Setup

Installing the JDK

Note: install JDK version 1.8.

For installing the JDK on Windows, see: https://blog.csdn.net/xuejiaguniang/article/details/86331557

Installing Maven

For installing and configuring Maven on Windows, see: https://blog.csdn.net/a805814077/article/details/100545928

Installing ProtocolBuffer

Note: it must be ProtocolBuffer 2.5.0, no other version.

Files needed:

  • protobuf-2.5.0.tar.gz
  • protoc-2.5.0-win32.zip

Download: https://download.csdn.net/download/u013501457/10209225

Official download: https://github.com/protocolbuffers/protobuf/releases

Unzipping protoc-2.5.0-win32.zip yields a protoc.exe file.

Unzip protobuf-2.5.0.tar.gz; in my case the extraction path is D:\Java\protobuf\protobuf-2.5.0.

  • a) Copy protoc.exe into C:\Windows\System32;
  • b) Copy protoc.exe into the extracted D:\Java\protobuf\protobuf-2.5.0\src directory;
  • c) In a Windows cmd window, cd into D:\Java\protobuf\protobuf-2.5.0\java and run "mvn package" to start the build; it eventually produces protobuf-java-2.5.0.jar under D:\Java\protobuf\protobuf-2.5.0\java\target;
  • d) If the command line ends with "BUILD SUCCESS", protobuf is installed; run "protoc --version" to double-check.
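A quick way to confirm step d from a cmd window (the jar path assumes the extraction directory used above):

```bat
:: The protoc found on PATH must be exactly 2.5.0.
protoc --version
:: prints: libprotoc 2.5.0

:: The jar produced by step c should exist here:
dir D:\Java\protobuf\protobuf-2.5.0\java\target\protobuf-java-2.5.0.jar
```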


Installing CMake

For installing and configuring CMake on Windows, see: https://blog.csdn.net/m0_37407756/article/details/79790417

Installing Visual Studio

Installing the Windows SDK

Installing zlib

Installing Git

Installing Git also provides the required Unix command-line tools:

Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.


Installing Python

Building

Note: do not add clean to the mvn command line, otherwise directories you have created or modified by hand will be wiped. Some example invocations (the last one resumes a failed build from the hadoop-common module):

mvn package [-Pdist][-Pdocs][-Psrc][-Dtar][-Dmaven.javadoc.skip=true]
mvn package -Dmaven.javadoc.skip=true -Dmaven.test.skip=true
mvn package -Pdist,native-win -DskipTests -Dtar -e -X
mvn package -Pdist,native-win -DskipTests -Dtar
mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true
mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-common

Troubleshooting

Error 1

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:125 in D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

Solution

In the D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\pom.xml configuration file, change failonerror="true" to failonerror="false".
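The change looks roughly like this in the maven-antrun-plugin "make" execution of that pom.xml; the element below is approximated from the error message above, not copied from the file, and only the failonerror attribute changes:

```xml
<!-- hadoop-hdfs-native-client/pom.xml, maven-antrun-plugin "make" execution.
     Approximated from the error message; only failonerror changes. -->
<exec failonerror="false"
      dir="${project.build.directory}/native"
      executable="cmake">
  <!-- ... existing arguments unchanged ... -->
</exec>
```

Note that this only stops the cmake failure from aborting the whole run; the underlying problem can resurface later, as Error 2 below shows.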


Error 2

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist.
[ERROR] around Ant part ...<copy todir="D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target/bin">... @ 13:86 in D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

Solution

The build complains that the directory D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist, so create the missing directory by hand at the given path. Note again that you must not pass clean to mvn, or this manually created directory gets wiped.
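Creating the directory by hand, assuming the same source tree location as in the error message:

```bat
:: Create the directory the Ant copy task expects (path taken from the error above).
mkdir D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo

:: Then resume the build WITHOUT clean, for example:
:: mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-hdfs-native-client
```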

Error 3

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.

Solution

Open D:\h\hadoop\hadoop-common-project\hadoop-common\src\main\native\native.sln in Visual Studio 2019 and let it upgrade (retarget) the projects to the installed toolset.
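The same retargeting can also be done non-interactively with devenv's /Upgrade switch (run from a VS 2019 developer command prompt; whether retargeting alone resolves the error on your setup is not guaranteed):

```bat
:: Upgrade/retarget the solution's projects to the installed VS 2019 toolset.
devenv /Upgrade D:\h\hadoop\hadoop-common-project\hadoop-common\src\main\native\native.sln
```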



References

  • [Compiling Hadoop 3.2.1 on 64-bit Windows 10 with VS2015] https://www.cnblogs.com/bclshuai/p/12009991.html
  • [Compiling hadoop-3.2.0 on 64-bit Windows 7] https://blog.csdn.net/MoodStreet/article/details/98972784
  • [Using Linux commands on Windows with GnuWin32] https://www.cnblogs.com/cnsevennight/p/4253167.html