How to use Hadoop libhdfs

About this post:
This content is shared for learning purposes; questions and comments are welcome.

About the author:
Programmer: Yang Hong (ellende)
blog: http://blog.csdn.net/ellende
email: yangh.personal@qq.com

Please credit the source when reposting. Parts of this post reference other blogs online; if anything infringes, please contact me.


This post mainly describes how to use libhdfs.so.


1. Environment

hadoop 2.7.2

JDK 1.7

CentOS


2. Locating libhdfs.so

The shared library is at hadoop-2.7.2/lib/native/libhdfs.so.

You also need libjvm.so from the JDK, at jdk1.7.0_79/jre/lib/amd64/server/libjvm.so.

The hdfs.h header is at:
hadoop-2.7.2/include/hdfs.h


3. Set the shared library search path

vim  /etc/ld.so.conf

Add the following lines:

/home/yj/HadoopFile/jdk1.7.0_79/jre/lib
/home/yj/HadoopFile/jdk1.7.0_79/jre/lib/amd64/server
/home/yj/HadoopFile/hadoop-2.7.2/lib/native

Then run the following command so the dynamic loader picks up the new paths:

ldconfig


4. Set CLASSPATH; otherwise the program fails at runtime with a NoClassDefFoundError for ExceptionUtils

vim ~/.profile

Add the following (note the quoted '*.jar' pattern, so the shell does not expand the glob, and export rather than extern, which is not a shell keyword):

CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$CLASSPATH
CLASSPATH=`$HADOOP_INSTALL/bin/hadoop classpath`:$CLASSPATH
CLASSPATH=`find $HADOOP_INSTALL/share/hadoop/ -name '*.jar' | awk '{printf("%s:", $0);}'`$CLASSPATH

export CLASSPATH

Then apply it with: source ~/.profile


5. Write the C source file

vim  above_sample.c

Contents:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "hdfs.h"

int main(int argc, char** argv)
{
    printf("start main!\n");

    //hdfsFS fs = hdfsConnect("default", 0);  //error!
    // Connect to the NameNode at localhost:9000 as user "hadoop"
    hdfsFS fs = hdfsConnectAsUser("localhost", 9000, "hadoop");
    if (!fs)
    {
        fprintf(stderr, "Failed to connect to HDFS!\n");
        exit(-1);
    }

    //const char* writePath = "./tempfile/testfile.txt";  //relative path: resolves to /user/hadoop/tempfile/testfile.txt
    const char* writePath = "hdfs://localhost:9000/tmp/testfile.txt";  //absolute path: /tmp/testfile.txt

    printf("open file!\n");
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
    if (!writeFile)
    {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }

    printf("write file!\n");
    const char* buffer = "Hello, World\n";
    // Write strlen(buffer) bytes (writing strlen+1 would also store the trailing '\0' in the file)
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer));
    if (num_written_bytes != (tSize)strlen(buffer))
    {
        fprintf(stderr, "Failed to write %s!\n", writePath);
        exit(-1);
    }
    if (hdfsFlush(fs, writeFile))
    {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }

    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    printf("close file!\n");

    return 0;
}

6. Compile

gcc ./above_sample.c  -I ../hadoop-2.7.2/include/ -L ../hadoop-2.7.2/lib/native/ -lhdfs -L ../jdk1.7.0_79/jre/lib/amd64/server/ -ljvm  -o above_sample

7. Run

./above_sample

The output looks like this:

gcc ./above_sample.c  -I ../hadoop-2.7.2/include/ -L ../hadoop-2.7.2/lib/native/ -lhdfs -L ../jdk1.7.0_79/jre/lib/amd64/server/ -ljvm  -o above_sample
./above_sample
start main!
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/yj/HadoopFile/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/yj/HadoopFile/hadoop-2.7.2/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/yj/HadoopFile/hadoop-2.7.2/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2016-07-31 01:03:33,451 WARN  util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
open file!
write file!
close file!

Check the result with hadoop fs -cat /tmp/testfile.txt:

Hello, World


The run succeeded!
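As a complement, the file just written can be read back through the same C API instead of hadoop fs -cat. This is a minimal sketch, assuming the same cluster and user as above (localhost:9000, user "hadoop") and the file created by above_sample.c; it uses hdfsOpenFile with O_RDONLY and hdfsRead from hdfs.h:

```c
#include <stdio.h>
#include <stdlib.h>
#include "hdfs.h"

int main(void)
{
    // Same connection parameters as in the write example above
    hdfsFS fs = hdfsConnectAsUser("localhost", 9000, "hadoop");
    if (!fs)
    {
        fprintf(stderr, "Failed to connect to HDFS!\n");
        exit(-1);
    }

    const char* readPath = "hdfs://localhost:9000/tmp/testfile.txt";
    hdfsFile readFile = hdfsOpenFile(fs, readPath, O_RDONLY, 0, 0, 0);
    if (!readFile)
    {
        fprintf(stderr, "Failed to open %s for reading!\n", readPath);
        exit(-1);
    }

    // Read up to sizeof(buffer)-1 bytes and print them as a string
    char buffer[256];
    tSize num_read = hdfsRead(fs, readFile, buffer, sizeof(buffer) - 1);
    if (num_read >= 0)
    {
        buffer[num_read] = '\0';
        printf("%s", buffer);
    }

    hdfsCloseFile(fs, readFile);
    hdfsDisconnect(fs);
    return 0;
}
```

Compile it with the same gcc command as in step 6, swapping in the new source file name.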


