Fuse-dfs on Ubuntu 11.04


You first need a working Hadoop environment and Java (sun-java6-jdk); setting those up is not covered here.

1. Prerequisites

sudo apt-get install gcc g++ make gawk ant automake
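
A quick sanity check that the toolchain is actually in place (my addition; exact version output will vary):

java -version    # should report the Sun JDK 6
ant -version
gcc --version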


2. Install FUSE

Download FUSE; here I used fuse-2.8.4.

tar xvf fuse-2.8.4.tar.gz
cd fuse-2.8.4
./configure --prefix=/opt/fuse --sbindir=/opt/fuse


make


sudo make install exec_prefix=/opt/fuse
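
Before anything can be mounted, the fuse kernel module also has to be loaded; a quick check (my addition, not in the original steps):

sudo modprobe fuse
lsmod | grep fuse    # should list the fuse module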

 

 

3. Add the following environment variables to /etc/profile or ~/.bashrc

export JAVA_HOME=/usr/lib/jvm/java-6-sun

export HADOOP_HOME=/opt/hadoop-0.20.204.0


export FUSE_HOME=/opt/fuse


export PATH=$PATH:/opt/hadoop-0.20.204.0/bin:/usr/lib/jvm/java-6-sun-1.6.0.06/include/

export OS_ARCH=i386

export OS_BIT=32


export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib
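
The OS_ARCH/OS_BIT values above assume a 32-bit system. On 64-bit Ubuntu, Hadoop's native build directory is Linux-amd64-64, so you would set instead:

export OS_ARCH=amd64
export OS_BIT=64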


source /etc/profile

source ~/.bashrc


 

4. Build libhdfs

cd $HADOOP_HOME

ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1

ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs

ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
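
If both targets build cleanly, the artifacts used in the rest of this post should now exist; a quick check (my addition, based on the paths used below):

ls ${HADOOP_HOME}/build/libhdfs/libhdfs.so*
ls ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs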

 

 

Then make the binaries executable and link them onto the PATH:

 

chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh

chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs

sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin

sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
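
After this, both commands should resolve from anywhere:

which fuse_dfs fuse_dfs_wrapper.sh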

 

5. Mount HDFS

sudo vim fuse_dfs_wrapper.sh    # adjust to match the environment variables set in step 3

cd

sudo mkdir /hdfs

sudo fuse_dfs_wrapper.sh dfs://server1:8020 /hdfs
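
If the mount succeeds, it should show up like any other filesystem; to verify, and to unmount when you are done (my suggested checks, not from the original post):

df -h /hdfs          # the filesystem column should show fuse_dfs
sudo umount /hdfs    # or: sudo fusermount -u /hdfs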


 
Error fixes:

1. A bug in Hadoop itself

Edit hdfs.h so the declaration of hdfsConnectAsUser matches what fuse_dfs expects: comment out the old prototype and add the five-argument version.

sudo vim /opt/hadoop/src/c++/libhdfs/hdfs.h

//hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);

hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user , const char *groups[], int groups_size );
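
Because this changes a header, rebuild libhdfs and fuse-dfs afterwards so the new signature is actually picked up; re-running the step-4 targets should be enough:

cd $HADOOP_HOME
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1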

 

2. A bug in fuse_dfs itself

In fuse_dfs_wrapper.sh, comment out the relative invocation ./fuse_dfs and call the fuse_dfs that is on the PATH instead:

sudo vim $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh

#./fuse_dfs $@

fuse_dfs $@


 

Problems that may come up along the way:

1. ./fuse_dfs_wrapper.sh: line 39: fuse_dfs: command not found

Check whether the operations in step 4 were completed (look in /usr/local/bin and verify that fuse_dfs_wrapper.sh and fuse_dfs are there and intact):

 

chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh

chmod +x ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs

sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin

sudo ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
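
To confirm the links now point where they should:

ls -l /usr/local/bin/fuse_dfs /usr/local/bin/fuse_dfs_wrapper.sh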


 

2. ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory

 

cd ${HADOOP_HOME} 

mkdir libhdfs

cd libhdfs

cp ${HADOOP_HOME}/build/libhdfs/libhdfs.so* ./

Then update LD_LIBRARY_PATH in fuse_dfs_wrapper.sh:

export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/opt/hadoop-0.20.204.0/libhdfs:/usr/local/lib:/usr/lib
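
With that LD_LIBRARY_PATH exported in your shell, ldd can confirm the loader now finds libhdfs (a diagnostic I am adding, not from the original):

ldd ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs | grep libhdfs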


 


 

Note: operations on the mount point (create, copy, delete, and so on) are reflected in HDFS directly, but the mount point does not yet display the existing HDFS directory tree. I will add the solution here once I find one.
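
One way to see that discrepancy is to compare HDFS's own view with what the mount shows:

hadoop fs -ls /    # what HDFS reports
ls -la /hdfs       # what the fuse mount shows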

This post draws on: http://mjhsieh7428.blogspot.tw/2011/07/hadoop-0203rc-hdfs-moun-by-fuse-ubuntu_06.html

 

 
