First you need a working Hadoop environment and Java (sun-java6-jdk). (Not covered here.)
1. Prerequisites
sudo apt-get install gcc g++ make gawk ant automake
2. Install Fuse
Download Fuse; I used fuse-2.8.4 here.
tar xvf fuse-2.8.4.tar.gz
cd fuse-2.8.4
./configure --prefix=/opt/fuse --sbindir=/opt/fuse
make
sudo make install exec_prefix=/opt/fuse
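Before moving on, it is worth confirming that the FUSE kernel module is actually available; without it, nothing can be mounted later. A small check (`/dev/fuse` only exists once the module is loaded):

```shell
# /dev/fuse must exist for fuse_dfs to mount anything.
if [ -e /dev/fuse ]; then
  echo "/dev/fuse present"
else
  echo "/dev/fuse missing -- try: sudo modprobe fuse"
fi
```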
3. Add the following environment variables to /etc/profile and ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export HADOOP_HOME=/opt/hadoop-0.20.204.0
export FUSE_HOME=/opt/fuse
export PATH=$PATH:/opt/hadoop-0.20.204.0/bin:/usr/lib/jvm/java-6-sun-1.6.0.06/include/
export OS_ARCH=i386
export OS_BIT=32
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib
source /etc/profile
source ~/.bashrc
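The hardcoded OS_ARCH=i386 / OS_BIT=32 above assume a 32-bit machine. A sketch (assuming a Linux host) that derives both from `uname -m` instead, so the same snippet works on 64-bit boxes:

```shell
# Derive OS_ARCH / OS_BIT from the machine type instead of hardcoding i386/32.
case "$(uname -m)" in
  x86_64) OS_ARCH=amd64; OS_BIT=64 ;;  # 64-bit build tree: Linux-amd64-64
  *)      OS_ARCH=i386;  OS_BIT=32 ;;  # fall back to the 32-bit tree used in this article
esac
export OS_ARCH OS_BIT
echo "OS_ARCH=$OS_ARCH OS_BIT=$OS_BIT"
```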
4. Build libhdfs
cd $HADOOP_HOME
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
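A quick sanity check that the ant build actually produced libhdfs where the symlink above expects it (paths as set in step 3; adjust if yours differ):

```shell
# Verify libhdfs.so landed in the c++ build tree before continuing.
LIBDIR="${HADOOP_HOME:-/opt/hadoop-0.20.204.0}/build/c++/Linux-${OS_ARCH:-i386}-${OS_BIT:-32}/lib"
if ls "$LIBDIR"/libhdfs.so* >/dev/null 2>&1; then
  echo "libhdfs found in $LIBDIR"
else
  echo "libhdfs missing -- re-run: ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1"
fi
```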
Then make the fuse-dfs binaries executable and link them into /usr/local/bin:
chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs
sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
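To confirm both commands are now reachable on PATH (a hedged check; "missing" means one of the chmod/ln -s steps above failed):

```shell
# Both the wrapper script and the fuse_dfs binary should now resolve from PATH.
for f in fuse_dfs fuse_dfs_wrapper.sh; do
  if command -v "$f" >/dev/null 2>&1; then
    echo "$f: found at $(command -v "$f")"
  else
    echo "$f: missing -- redo the chmod/ln -s steps above"
  fi
done
```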
5. Mount HDFS
sudo vim fuse_dfs_wrapper.sh   # set the environment variables to match those from /etc/profile in step 3
sudo mkdir /hdfs
sudo fuse_dfs_wrapper.sh dfs://server1:8020 /hdfs
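If the mount succeeded, it shows up in /proc/mounts. A quick check (mount point /hdfs as above):

```shell
# A successful fuse-dfs mount appears in /proc/mounts with /hdfs as the mount point.
if grep -qs ' /hdfs ' /proc/mounts; then
  echo "/hdfs is mounted"
else
  echo "/hdfs is not mounted -- check the fuse_dfs_wrapper.sh output for errors"
fi
```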
Fixing known errors:
1. A bug in Hadoop itself: the hdfsConnectAsUser prototype in hdfs.h must be extended to take the caller's groups
sudo vim /opt/hadoop/src/c++/libhdfs/hdfs.h
//hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);
hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user, const char *groups[], int groups_size);
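The same header edit can be scripted with sed instead of done by hand in vim. This is a sketch demonstrated on a temporary copy; point HDFS_H at the real $HADOOP_HOME/src/c++/libhdfs/hdfs.h to apply it for real:

```shell
# Comment out the old hdfsConnectAsUser prototype and append the extended one.
# Demonstrated on a temp copy; set HDFS_H to the real hdfs.h to patch it.
HDFS_H="$(mktemp)"
echo 'hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);' > "$HDFS_H"
sed -i 's|^hdfsFS hdfsConnectAsUser(const char\* host, tPort port, const char \*user);$|//&\nhdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user, const char *groups[], int groups_size);|' "$HDFS_H"
cat "$HDFS_H"
rm -f "$HDFS_H"
```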
2. A bug in Fuse-DFS itself: the wrapper invokes fuse_dfs by relative path; use the one on PATH instead
sudo vim $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
#./fuse_dfs $@
fuse_dfs $@
Problems that may come up along the way:
1. ./fuse_dfs_wrapper.sh: line 39: fuse_dfs: command not found
Check that step 4 was fully completed (look in /usr/local/bin and confirm fuse_dfs_wrapper.sh and fuse_dfs are present and executable):
chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs
sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
2. ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory
cd ${HADOOP_HOME}
mkdir libhdfs
cd libhdfs
cp ${HADOOP_HOME}/build/libhdfs/libhdfs.so* ./
and update LD_LIBRARY_PATH in fuse_dfs_wrapper.sh accordingly:
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/opt/hadoop-0.20.204.0/libhdfs:/usr/local/lib:/usr/lib
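Once LD_LIBRARY_PATH is fixed, the dynamic linker should resolve libhdfs.so.0 for fuse_dfs; a hedged check with ldd (paths as used in this article):

```shell
# ldd lists the shared libraries fuse_dfs links against and where they resolve.
BIN="${HADOOP_HOME:-/opt/hadoop-0.20.204.0}/build/contrib/fuse-dfs/fuse_dfs"
if [ -x "$BIN" ]; then
  ldd "$BIN" | grep libhdfs || echo "libhdfs still unresolved -- check LD_LIBRARY_PATH"
else
  echo "fuse_dfs not found at $BIN -- finish step 4 first"
fi
```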
Note: operations on the mount point (create, copy, delete, etc.) are currently reflected in HDFS, but the mount point does not yet show the existing HDFS directory tree. I will update this post once I find a fix.
Reference: http://mjhsieh7428.blogspot.tw/2011/07/hadoop-0203rc-hdfs-moun-by-fuse-ubuntu_06.html