Accessing Hadoop from C

The plan is to access Hadoop from C. I chose LibHDFS and will test it with Eclipse (V3.7.2) CDT and Cygwin.
1. Download CDT and install it in Eclipse through Install New Software, using the update-site address.
2. Install the Cygwin development packages, including:
gcc, gcc-core, gcc-g++, gcc-mingw-core, gcc-mingw-g++, make, gdb, binutils
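If you prefer to script the package selection, Cygwin's setup program accepts a package list on the command line; a sketch, assuming setup.exe sits in the current directory:

setup.exe -q -P gcc,gcc-core,gcc-g++,gcc-mingw-core,gcc-mingw-g++,make,gdb,binutils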
3. Add Cygwin's bin directory to the Windows Path environment variable.
4. Running gcc under Windows fails with an "access denied" error. Check whether g++.exe and gcc.exe are only about 1 KB in size; if so, they are symbolic links, which Windows cannot execute directly. Rename each file as a backup, then copy g++-3.exe (or g++-4.exe) to g++.exe and copy gcc-3.exe to gcc.exe; see the referenced article.
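A sketch of the fix from a Cygwin shell, assuming the compilers live in /bin and the installed versions carry a -4 suffix (adjust to whatever ls shows):

cd /bin
ls -l gcc.exe g++.exe        # symlinks show up as ~1 KB files
mv gcc.exe gcc.exe.bak
mv g++.exe g++.exe.bak
cp gcc-4.exe gcc.exe
cp g++-4.exe g++.exe
gcc --version                # should now print the compiler banner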
5. Write the code:

#include "hdfs.h" int main(int argc, char **argv) { if (argc != 4) { fprintf(stderr, "Usage: hdfs_write \n"); exit(-1); } hdfsFS fs = hdfsConnect("default", 0); if (!fs) { fprintf(stderr, "Oops! Failed to connect to hdfs!\n"); exit(-1); } const char* writeFileName = argv[1]; tSize fileTotalSize = strtoul(argv[2], NULL, 10); tSize bufferSize = strtoul(argv[3], NULL, 10); hdfsFile writeFile = hdfsOpenFile(fs, writeFileName, O_WRONLY, bufferSize, 0, 0); if (!writeFile) { fprintf(stderr, "Failed to open %s for writing!\n", writeFileName); exit(-2); } // data to be written to the file char* buffer = malloc(sizeof(char) * bufferSize); if (buffer == NULL) { return -2; } int i = 0; for (i = 0; i < bufferSize; ++i) { buffer[i] = 'a' + (i % 26); } // write to the file tSize nrRemaining; for (nrRemaining = fileTotalSize; nrRemaining > 0; nrRemaining -= bufferSize) { int curSize = (bufferSize < nrRemaining) ? bufferSize : (int) nrRemaining; hdfsWrite(fs, writeFile, (void*) buffer, curSize); } free(buffer); hdfsCloseFile(fs, writeFile); hdfsDisconnect (fs); return 0; }

6. Add the HADOOP_HOME environment variable. In a Cygwin shell, execute:

export HADOOP_HOME=/home/test/hadoop0.20.2

or add the same line to /etc/profile.
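Note that libhdfs launches a JVM through JNI, so at run time the CLASSPATH must also contain the Hadoop configuration directory and jars, or hdfsConnect will fail. A minimal sketch, assuming the standard Hadoop 0.20/1.0 layout:

export CLASSPATH=${HADOOP_HOME}/conf
for jar in ${HADOOP_HOME}/*.jar ${HADOOP_HOME}/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:${jar}
done
export CLASSPATH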

7. In Cygwin, execute:

gcc writeHDFS.c -I ${HADOOP_HOME}/src/c++/libhdfs -I /usr/local/jdk/include -I/usr/local/jdk/include/win32 -L${HADOOP_HOME}/c++/Linux-i386-32/lib -lhdfs -o writeHDFS
8. On Linux, execute:

gcc writeHDFS.c -I ${HADOOP_HOME}/src/c++/libhdfs -I ${JAVA_HOME}/include -I ${JAVA_HOME}/include/linux -L${HADOOP_HOME}/c++/Linux-i386-32/lib -lhdfs -L${JAVA_HOME}/jre/lib/i386/client -ljvm -o writeHDFS
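Before running the binary, ldd can show whether its shared libraries resolve (a generic diagnostic, not specific to Hadoop):

ldd ./writeHDFS

Any line reporting "not found" points at the problem fixed in step 9.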
9. Running ./writeHDFS kept failing with:

error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory

The libhdfs.so, libhdfs.so.0, and libhdfs.so.0.0.0 files under Hadoop's c++/Linux-i386-32/lib directory are identical copies, so first rename libhdfs.so.0 to libhdfs.so.0.bak, then recreate it as a symlink:

ln -s ./libhdfs.so.0.0.0 ./libhdfs.so.0

Next run sudo vi /etc/ld.so.conf and add the following lines:

/home/pc01/hadoop-1.0.1/c++/Linux-i386-32/lib
/usr/local/lib/jdk1.7.0_03/jre/lib/i386/server

Finally run sudo /sbin/ldconfig -v to rebuild the dynamic linker cache.
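An alternative to editing /etc/ld.so.conf is to export LD_LIBRARY_PATH for the current shell; a sketch using the same two directories:

export LD_LIBRARY_PATH=${HADOOP_HOME}/c++/Linux-i386-32/lib:${JAVA_HOME}/jre/lib/i386/server:${LD_LIBRARY_PATH}
./writeHDFS /tmp/testfile.txt 1048576 65536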


