Connecting PHP to HDFS via HttpFS

This article shows how to perform WebHDFS operations from PHP through an HttpFS gateway using the michaelbutler/php-WebHDFS library: creating and uploading files, and reading file contents. It also points out a bug in the library's upload path and how to fix it, and lists the file and directory operations (create, write, read, set permissions, etc.) from the library's README.

1. The library (contains a bug)

GitHub - michaelbutler/php-WebHDFS: A PHP client for WebHDFS



2. Calling code

Code:


<?php
// luozhen@antiy.cn

require_once('../webhdfs/src/org/apache/hadoop/WebHDFS.php');
require_once('../webhdfs/src/org/apache/hadoop/tools/Curl.php');
require_once('../webhdfs/src/org/apache/hadoop/WebHDFS/Exception.php');

$host = '192.168.168.126';

// 14000 is the HttpFS gateway port; Master00:9000 is the namenode in this setup.
$hdfs = new org\apache\hadoop\WebHDFS(
    $host,
    '14000',
    'hdfs',
    'Master00',
    '9000',
    false
);

echo $hdfs->getHomeDirectory();

// List directory status
$response = $hdfs->listDirectories('/user/hdfs/');

// Upload (overwrites by default)
$response = $hdfs->create('/user/hdfs/ls/composer.json', './webfs.txt');

// Read the file contents back
$response = $hdfs->open('/user/hdfs/ls/composer.json');
echo "<br>";
var_dump($response);


3. The bug: this code fails when uploading a file

The fix is simply to add the missing request header.
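The header in question is most likely `Content-Type: application/octet-stream`: HttpFS rejects PUT/POST bodies that do not carry it. A minimal sketch of the upload done directly with PHP's curl extension (not the library's API), reusing the host, port, and paths from the example above; `data=true` tells HttpFS the body is the file content:

```php
<?php
// Sketch: upload to HDFS through HttpFS with the required Content-Type header.
// Host/port/paths are the example values from this article.
$url = 'http://192.168.168.126:14000/webhdfs/v1/user/hdfs/ls/composer.json'
     . '?op=CREATE&user.name=hdfs&overwrite=true&data=true';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents('./webfs.txt'));
// Without this header HttpFS refuses the request body.
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/octet-stream']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
$response = curl_exec($ch);
curl_close($ch);
```

The same effect can be had by patching the library's Curl wrapper to set this header on its upload requests.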

4. Usage reference (from the library's README.md)

File and directory operations

Create and write to a file
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->create('user/hadoop-username/new-file.txt', 'local-file.txt');

Create a file with inline content
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->createWithData('user/hadoop-username/new-file.txt', 'content');

Append to a file
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->append('user/hadoop-username/file-to-append-to.txt', 'local-file.txt');

Concat files
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->concat('user/hadoop-username/concatenated-file.txt', '/test/file1,/test/file2,/test/file3');

Open and read a file
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->open('user/hadoop-username/file.txt');

Make directories
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->mkdirs('user/hadoop-username/new/directory/structure');

Create a symlink
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->createSymLink('user/hadoop-username/file.txt', '/user/hadoop-username/symlink-to-file.txt');

Rename a file/directory
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->rename('user/hadoop-username/file.txt', '/user/hadoop-username/renamed-file.txt');

Delete a file/directory
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->delete('user/hadoop-username/file.txt');

Status of a file/directory
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->getFileStatus('user/hadoop-username/file.txt');

List a directory
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->listStatus('user/hadoop-username/');
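The listing calls return the raw WebHDFS JSON response. A minimal sketch of decoding it, using a hard-coded sample in the `FileStatuses`/`FileStatus` shape defined by the WebHDFS REST API:

```php
<?php
// Decode a WebHDFS LISTSTATUS response. The wrapper shape
// ("FileStatuses" -> "FileStatus" array) is from the WebHDFS REST API.
$json = '{"FileStatuses":{"FileStatus":['
      . '{"pathSuffix":"ls","type":"DIRECTORY","length":0},'
      . '{"pathSuffix":"webfs.txt","type":"FILE","length":42}]}}';

$data = json_decode($json, true);
foreach ($data['FileStatuses']['FileStatus'] as $entry) {
    echo $entry['pathSuffix'] . ' (' . $entry['type'] . ")\n";
}
```

Against a live cluster, `$json` would be the body returned by `listStatus()`.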

Other filesystem operations

Get the content summary of a directory
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->getContentSummary('user/hadoop-username/');

Get a file checksum
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->getFileChecksum('user/hadoop-username/file.txt');

Get the home directory
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->getHomeDirectory();

Set permissions
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->setPermission('user/hadoop-username/file.txt', '777');

Set the owner
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->setOwner('user/hadoop-username/file.txt', 'other-user');

Set the replication factor
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$hdfs->setReplication('user/hadoop-username/file.txt', '2');

Set access or modification time
$hdfs = new WebHDFS('mynamenode.hadoop.com', '50070', 'hadoop-username');
$response = $hdfs->setTimes('user/hadoop-username/file.txt');