Accessing HDFS from C with the Hadoop libhdfs API

1. Introduction to the libhdfs interface

    The Hadoop FileSystem API is a Java client API; HDFS does not expose a native C interface. However, Hadoop ships libhdfs, a JNI-based C binding that makes accessing HDFS from C programs considerably more convenient.

   Header file: hdfs.h (in ${HADOOP_HOME}/include)

   Library file: libhdfs.so (in ${HADOOP_HOME}/lib/native)

2. Compiling the program

    Take the example code from the Apache Hadoop website:

#include "hdfs.h"
#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv)
{
    /* Connect to the NameNode (replace hostname/port with your own). */
    hdfsFS fs = hdfsConnect("hostname", 9000);
    if (!fs)
    {
        fprintf(stderr, "Failed to connect to HDFS!\n");
        exit(-1);
    }
    const char *writePath = "/maxf/testfile1.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
    if (!writeFile)
    {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }
    const char *buffer = "Hello, World!\n";
    /* strlen(buffer)+1 also writes the trailing NUL, as in the official example. */
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void *)buffer, strlen(buffer) + 1);
    printf("Wrote %d bytes\n", (int)num_written_bytes);
    if (hdfsFlush(fs, writeFile))
    {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
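Reading the file back uses the symmetric calls. The sketch below is untested against a live cluster and cannot run without one; the hostname, port, and path simply mirror the assumptions of the write example above:

```c
#include "hdfs.h"
#include <fcntl.h>   /* O_RDONLY */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    hdfsFS fs = hdfsConnect("hostname", 9000);  /* assumed NameNode address */
    if (!fs)
    {
        fprintf(stderr, "Failed to connect to HDFS!\n");
        exit(-1);
    }
    const char *readPath = "/maxf/testfile1.txt";
    hdfsFile readFile = hdfsOpenFile(fs, readPath, O_RDONLY, 0, 0, 0);
    if (!readFile)
    {
        fprintf(stderr, "Failed to open %s for reading!\n", readPath);
        exit(-1);
    }
    char buf[256];
    /* hdfsRead returns the number of bytes read, or -1 on error. */
    tSize n = hdfsRead(fs, readFile, buf, sizeof(buf) - 1);
    if (n >= 0)
    {
        buf[n] = '\0';
        printf("Read %d bytes: %s", (int)n, buf);
    }
    hdfsCloseFile(fs, readFile);
    hdfsDisconnect(fs);
    return 0;
}
```

It compiles and links with the same gcc invocation shown below, changing only the source file name.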
Compile the program:

gcc hdfs.c -o c_hdfs -I /bata/lzp/hadoop-2.7.1/include -L /bata/lzp/hadoop-2.7.1/lib/native -lhdfs

This produces linker errors:
/usr/bin/ld: warning: libjvm.so, needed by /home/hadoop/hadoop-2.7.1/lib/native/libhdfs.so, not found (try using -rpath or -rpath-link)
/home/hadoop/hadoop-2.7.1/lib/native/libhdfs.so: undefined reference to `JNI_CreateJavaVM@SUNWprivate_1.1'
/home/hadoop/hadoop-2.7.1/lib/native/libhdfs.so: undefined reference to `JNI_GetCreatedJavaVMs@SUNWprivate_1.1'
collect2: ld returned 1 exit status

1. Add the directories containing libhdfs.so and libjvm.so to /etc/ld.so.conf, then run ldconfig.

2. Verify that the configured library paths took effect with ldconfig -v | grep hdfs.

After this, the program compiles and links successfully.
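A minimal sketch of the two steps above, assuming Hadoop lives under /home/hadoop/hadoop-2.7.1 and the JDK's server JVM under /usr/java/default/jre/lib/amd64/server (both paths are assumptions; adjust them to your installation):

```shell
# Tell the dynamic linker where libhdfs.so and libjvm.so live.
echo "/home/hadoop/hadoop-2.7.1/lib/native" | sudo tee /etc/ld.so.conf.d/hadoop.conf
echo "/usr/java/default/jre/lib/amd64/server" | sudo tee -a /etc/ld.so.conf.d/hadoop.conf

# Rebuild the linker cache, then confirm both libraries are picked up.
sudo ldconfig
ldconfig -p | grep -E 'libhdfs|libjvm'
```

Alternatively, the -rpath suggestion from the linker warning works without touching system configuration (e.g. add -Wl,-rpath to the gcc command), but ld.so.conf fixes it for every binary on the machine.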

3. Running the program

Running the binary then fails with a series of NoClassDefFoundError exceptions, as missing jars are discovered one by one:

1. Missing commons-lang-2.6.jar

2. Missing commons-configuration-1.6.jar, hadoop-common-2.7.1.jar, and commons-logging-1.2.jar

3. Missing guava-12.0.1.jar

4. Missing commons-collections-3.2.2.jar, slf4j-api-1.6.1.jar, slf4j-log4j12-1.6.1.jar, log4j-1.2.17.jar, hadoop-hdfs-2.7.1.jar, and hadoop-auth-2.5.1.jar

5. Missing htrace-core-3.1.0-incubating.jar

Many more jars need to be configured; each missing one triggers another NoClassDefFoundError, so they are not all listed here.

The following CLASSPATH settings can be used as a reference:

export CLASSPATH=.:$HADOOP_HOME/lib/commons-lang-2.6.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/commons-configuration-1.6.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/commons-logging-1.2.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/guava-12.0.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/commons-collections-3.2.2.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/slf4j-api-1.6.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/slf4j-log4j12-1.6.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/share/hadoop/common/lib/log4j-1.2.17.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-2.7.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/hadoop-auth-2.5.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/share/hadoop/common/lib/hadoop-auth-2.7.1.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/htrace-core-3.1.0-incubating.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/commons-cli-1.2.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/protobuf-java-2.5.0.jar:$CLASSPATH
export CLASSPATH=.:$HADOOP_HOME/lib/commons-io-2.4.jar:$CLASSPATH
export CLASSPATH=.:/bata/lzp/maxf/apache-drill-1.12.0/jars/classb/javax.servlet-api-3.1.0.jar:$CLASSPATH
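Rather than listing every jar by hand, recent Hadoop releases can generate the full classpath themselves. This is a sketch assuming the hadoop launcher script is on PATH; the --glob flag expands the wildcard entries into concrete jar paths, which libhdfs requires because the embedded JVM does not expand wildcards:

```shell
# Expands to every jar Hadoop itself would load.
export CLASSPATH=$(hadoop classpath --glob)
```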

Once the classpath is configured, the compiled C binary runs successfully.


Check the file created in HDFS.
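The write can be verified with the standard hdfs CLI, assuming the cluster from the example and the /maxf/testfile1.txt path used above:

```shell
# List the file and print its contents.
hdfs dfs -ls /maxf
hdfs dfs -cat /maxf/testfile1.txt
```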

