Solution: explicitly declare the map output key/value types (and the final output types) in the job driver:
job.setMapOutputKeyClass(IntWritable.class);
job.setMapOutputValueClass(Text.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
if (key.getClass() != keyClass) {
    throw new IOException("Type mismatch in key from map: expected "
        + keyClass.getName() + ", received "
        + key.getClass().getName());
}
if (value.getClass() != valClass) {
    throw new IOException("Type mismatch in value from map: expected "
        + valClass.getName() + ", received "
        + value.getClass().getName());
}
MapOutputBuffer.collect() checks whether the key/value types actually passed in match the keyClass and valClass that MapOutputBuffer was initialized with:
keyClass = (Class<K>) job.getMapOutputKeyClass();
valClass = (Class<V>) job.getMapOutputValueClass();
public Class<?> getOutputKeyClass() { return getClass("mapred.output.key.class", LongWritable.class, Object.class); } // if not specified, the default LongWritable is used
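The check quoted above can be reproduced as a small self-contained sketch. Note that TypeCheckSketch and checkKey below are illustrative names, not Hadoop classes; only the comparison and the exception message mirror what MapOutputBuffer.collect() does:

```java
import java.io.IOException;

// Minimal sketch (not Hadoop's actual MapOutputBuffer): the runtime class of
// each emitted key must be exactly equal to the configured map output key
// class, otherwise an IOException with the familiar message is thrown.
public class TypeCheckSketch {
    static void checkKey(Object key, Class<?> keyClass) throws IOException {
        // Note: reference equality on Class objects, so even a subclass of
        // the expected type would be rejected.
        if (key.getClass() != keyClass) {
            throw new IOException("Type mismatch in key from map: expected "
                + keyClass.getName() + ", received "
                + key.getClass().getName());
        }
    }

    public static void main(String[] args) {
        try {
            checkKey("hello", String.class);            // matches, no exception
            checkKey(Integer.valueOf(1), String.class); // mismatch, throws
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This is why calling job.setMapOutputKeyClass()/setMapOutputValueClass() with the types your Mapper actually emits makes the error go away: the configured class and the emitted class become identical.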
The fixes above are made directly in the program; alternatively, the key and value types can be specified on the command line:
-D mapred.mapoutput.key.class=org.apache.hadoop.io.LongWritable or -jobconf mapred.mapoutput.key.class=org.apache.hadoop.io.LongWritable
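A full invocation might look like the following sketch (the jar name, driver class, and input/output paths are placeholders; note that the -D options only take effect if the driver parses generic options, e.g. via ToolRunner/GenericOptionsParser, and they must appear before the job's own arguments):

```shell
# Hypothetical jar/class/paths; -D generic options go before job arguments.
hadoop jar wordcount.jar WordCount \
  -D mapred.mapoutput.key.class=org.apache.hadoop.io.LongWritable \
  -D mapred.mapoutput.value.class=org.apache.hadoop.io.Text \
  input/ output/
```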