The Java code is as follows:
FileSystem fs = FileSystem.get(conf);
in = fs.open(new Path("hdfs://192.168.130.54:19000/user/hmail/output/part-00000"));
The following exception is thrown:
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.130.54:19000/user/hmail/output/part-00000, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:357)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:125)
    at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:283)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:356)
    at com.netease.hadoop.HDFSCatWithAPI.main(HDFSCatWithAPI.java:23)
Solution:
Copy the cluster's core-site.xml and hdfs-site.xml into the current project so they are on the classpath. Without them, the Configuration falls back to the default filesystem file:///, so FileSystem.get(conf) returns a LocalFileSystem and rejects the hdfs:// path ("Wrong FS: ..., expected: file:///"). With the config files in place, run the program again.
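Alternatively, the error can be avoided in code without copying the config files, by deriving the FileSystem from the path's own URI (or by setting the default filesystem explicitly). A minimal sketch, reusing the host, port, and path from the example above; it assumes the Hadoop client jars are on the classpath and the cluster is reachable:

```java
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HDFSCatWithAPI {
    public static void main(String[] args) throws Exception {
        String uri = "hdfs://192.168.130.54:19000/user/hmail/output/part-00000";
        Configuration conf = new Configuration();

        // Option 1: bind the FileSystem to the path's own scheme/authority
        // instead of relying on fs.defaultFS, which is file:/// when the
        // cluster's core-site.xml is not on the classpath.
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        // Option 2 (equivalent): set the default filesystem before calling
        // FileSystem.get(conf). Older Hadoop versions use the key
        // "fs.default.name" instead of "fs.defaultFS".
        // conf.set("fs.defaultFS", "hdfs://192.168.130.54:19000");

        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
```

Either option makes the checkPath call in the stack trace succeed, because the FileSystem instance now matches the hdfs:// scheme of the path being opened.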
Reposted from: https://blog.51cto.com/xiaoxiancai/829875