@Override
public void readFields(DataInput in) throws IOException {
    uid = in.readLong();
    fansNum = in.readInt();
    followNum = in.readInt();
    feedNum = in.readInt();
    depth = in.readInt();
    fans.readFields(in);
    follow.readFields(in);
    feedList.readFields(in);
    // nick was originally a Text field; it is now a plain String read with readUTF().
    //nick.readFields(in);
    nick = in.readUTF();
}

@Override
public void write(DataOutput out) throws IOException {
    out.writeLong(uid);
    out.writeInt(fansNum);
    out.writeInt(followNum);
    out.writeInt(feedNum);
    out.writeInt(depth);
    fans.write(out);
    follow.write(out);
    feedList.write(out);
    out.writeUTF(nick);
    //nick.write(out);
}
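The post does not show the field declarations. A minimal sketch of how they might look after the change, with nick as a plain String; the class name and the concrete types of fans, follow, and feedList are guesses, and the class is declared abstract here only so the sketch compiles without repeating the two methods above:

    import org.apache.hadoop.io.ArrayWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;

    // Hypothetical container class; only the field names come from the snippet above.
    public abstract class UserWritable implements Writable {
        protected long uid;
        protected int fansNum;
        protected int followNum;
        protected int feedNum;
        protected int depth;
        // Nested Writables; the element types here are assumptions.
        protected ArrayWritable fans = new ArrayWritable(LongWritable.class);
        protected ArrayWritable follow = new ArrayWritable(LongWritable.class);
        protected ArrayWritable feedList = new ArrayWritable(Text.class);
        // Previously a Text field (see the commented-out calls above);
        // now a plain String serialized with writeUTF()/readUTF().
        protected String nick = "";

        // readFields(DataInput) and write(DataOutput) are the two methods shown above.
    }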
When the object was passed from map to reduce, nick came out garbled. Pulling the affected text out and testing it on its own worked fine, which made the problem hard to track down. After changing nick to a String, the bug disappeared.
Hadoop version: 0.20.2
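The post leaves the root cause open. One commonly reported source of exactly this symptom when nick is a Text field (stale trailing characters on some records, while the same string is fine when tested in isolation) is that Hadoop reuses Writable instances between records, and Text.getBytes() returns the whole backing byte array, of which only the first getLength() bytes belong to the current value; converting with new String(text.getBytes()) then drags in leftovers from an earlier, longer value. Whether that was the cause here is not confirmed. A small standalone demo of the effect:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.io.Text;

    public class TextReuseDemo {
        public static void main(String[] args) throws IOException {
            // Serialize two values, the second shorter than the first.
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buffer);
            new Text("a-much-longer-nickname").write(out);
            new Text("short").write(out);

            // Deserialize both into the same Text instance, the way the
            // framework reuses key/value objects between records.
            DataInputStream in =
                new DataInputStream(new ByteArrayInputStream(buffer.toByteArray()));
            Text nick = new Text();
            nick.readFields(in);
            nick.readFields(in);

            // getBytes() exposes the entire backing array; only the first
            // getLength() bytes belong to the current value.
            String unsafe = new String(nick.getBytes(), StandardCharsets.UTF_8);
            String safe = new String(nick.getBytes(), 0, nick.getLength(),
                                     StandardCharsets.UTF_8);

            System.out.println(unsafe);          // "short" plus stale bytes from the longer value
            System.out.println(safe);            // "short"
            System.out.println(nick.toString()); // "short" -- toString() is also safe
        }
    }

Note also that the writeUTF()/readUTF() pair that replaced the Text serialization uses Java's modified UTF-8 and is limited to 65,535 encoded bytes per string, which is harmless for a nickname but worth keeping in mind for longer fields.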
This article was reposted from dogegg250's 51CTO blog. Original link: http://blog.51cto.com/jianshusoft/787240. To republish, please contact the original author.