Inheritance hierarchy 1

1. java.lang.Object
  |__ org.apache.hadoop.mapreduce.JobContext
           |__org.apache.hadoop.mapreduce.TaskAttemptContext
                   |__ org.apache.hadoop.mapreduce.TaskInputOutputContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT>
                          |__org.apache.hadoop.mapreduce.MapContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT>
                               |__ org.apache.hadoop.mapreduce.Mapper.Context
Description:
                        
public class Mapper.Context

extends MapContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT>

Constructor Summary:
Mapper.Context(Configuration conf, TaskAttemptID taskid, RecordReader<KEYIN,VALUEIN> reader, RecordWriter<KEYOUT,VALUEOUT> writer, OutputCommitter committer, StatusReporter reporter, InputSplit split)
Method Summary:
  Methods inherited from class org.apache.hadoop.mapreduce.MapContext
getCurrentKey, getCurrentValue, getInputSplit, nextKeyValue
  Methods inherited from class org.apache.hadoop.mapreduce.TaskInputOutputContext
getCounter, getCounter, getOutputCommitter, progress, setStatus, write
  Methods inherited from class org.apache.hadoop.mapreduce.TaskAttemptContext
getStatus, getTaskAttemptID
  Methods inherited from class org.apache.hadoop.mapreduce.JobContext
getCombinerClass, getConfiguration, getCredentials, getGroupingComparator, getInputFormatClass, getJar, getJobID, getJobName, getMapOutputKeyClass, getMapOutputValueClass, getMapperClass, getNumReduceTasks, getOutputFormatClass, getOutputKeyClass, getOutputValueClass, getPartitionerClass, getReducerClass, getSortComparator, getWorkingDirectory
  Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
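You never construct a Mapper.Context yourself; the framework creates one per map task and passes it to the Mapper's setup(), map(), and cleanup() methods. The following is a minimal sketch (not from the original post) of how the inherited methods listed above are typically called inside a Mapper; the counter group and counter names are made up for illustration:

 import java.io.IOException;

 import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.LongWritable;
 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.mapreduce.Mapper;

 public class ContextDemoMapper
   extends Mapper<LongWritable, Text, Text, IntWritable> {

     @Override
     protected void setup(Context context) {
       // From JobContext: read job-level settings.
       String jobName = context.getJobName();
       int reducers = context.getNumReduceTasks();
       System.out.println("Running " + jobName + " with " + reducers + " reducers");
     }

     @Override
     protected void map(LongWritable key, Text value, Context context)
       throws IOException, InterruptedException {
       // From TaskInputOutputContext: counters and output (hypothetical counter names).
       context.getCounter("Demo", "records").increment(1);
       context.write(new Text("length"), new IntWritable(value.getLength()));
       // From TaskInputOutputContext: status and liveness reporting.
       context.setStatus("processed offset " + key.get());
       context.progress();
     }
 }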
2. java.lang.Object
  |__ org.apache.hadoop.mapreduce.JobContext
           |__org.apache.hadoop.mapreduce.TaskAttemptContext
                   |__ org.apache.hadoop.mapreduce.TaskInputOutputContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT>
                          |__org.apache.hadoop.mapreduce.ReduceContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT>
                               |__ org.apache.hadoop.mapreduce.Reducer.Context
Description:
public class Reducer.Context

extends ReduceContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT>
Constructor Summary:
Reducer.Context(Configuration conf, TaskAttemptID taskid, RawKeyValueIterator input, Counter inputKeyCounter, Counter inputValueCounter, RecordWriter<KEYOUT,VALUEOUT> output, OutputCommitter committer, StatusReporter reporter, RawComparator<KEYIN> comparator, Class<KEYIN> keyClass, Class<VALUEIN> valueClass)
Method Summary:
  Methods inherited from class org.apache.hadoop.mapreduce.ReduceContext
getCurrentKey, getCurrentValue, getValues, nextKey, nextKeyValue
  Methods inherited from class org.apache.hadoop.mapreduce.TaskInputOutputContext
getCounter, getCounter, getOutputCommitter, progress, setStatus, write
  Methods inherited from class org.apache.hadoop.mapreduce.TaskAttemptContext
getStatus, getTaskAttemptID
  Methods inherited from class org.apache.hadoop.mapreduce.JobContext
getCombinerClass, getConfiguration, getCredentials, getGroupingComparator, getInputFormatClass, getJar, getJobID, getJobName, getMapOutputKeyClass, getMapOutputValueClass, getMapperClass, getNumReduceTasks, getOutputFormatClass, getOutputKeyClass, getOutputValueClass, getPartitionerClass, getReducerClass, getSortComparator, getWorkingDirectory
  Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
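Similarly, a Reducer.Context is created by the framework and handed to the Reducer's setup(), reduce(), and cleanup() methods. As a minimal sketch (not from the original post), here is a reducer that pairs with the MaxTemperatureMapper shown below; the grouping of values by key is what ReduceContext's nextKey()/getValues() do under the hood, and the output goes through the inherited write() method:

 import java.io.IOException;

 import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.mapreduce.Reducer;

 public class MaxTemperatureReducer
   extends Reducer<Text, IntWritable, Text, IntWritable> {

     @Override
     protected void reduce(Text key, Iterable<IntWritable> values, Context context)
       throws IOException, InterruptedException {
       // values holds every temperature emitted for this year; keep the maximum.
       int maxValue = Integer.MIN_VALUE;
       for (IntWritable value : values) {
         maxValue = Math.max(maxValue, value.get());
       }
       // From TaskInputOutputContext: emit one (year, max temperature) pair per key.
       context.write(key, new IntWritable(maxValue));
     }
 }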

Inheritance hierarchy 2

Code

1.MaxTemperatureMapper.java

 import java.io.IOException;

 import org.apache.hadoop.mapreduce.Mapper;
 import org.apache.hadoop.io.LongWritable;
 import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.Text;

 public class MaxTemperatureMapper
   extends Mapper<LongWritable, Text, Text, IntWritable> {

     @Override
     public void map(LongWritable key, Text value, Context context)
       throws IOException, InterruptedException {

       // Each input value is one line of an NCDC weather record.
       String line = value.toString();
       // The year is at offsets 15-18 and the (signed) temperature at 87-91.
       String year = line.substring(15, 19);
       int airTemperature = Integer.parseInt(line.substring(87, 92));
       context.write(new Text(year), new IntWritable(airTemperature));
     }
 }

2.MaxTemperatureMapperTest.java

 import java.io.IOException;
 import org.apache.hadoop.io.LongWritable;
 import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.Text;
 import org.junit.Test;
 import org.apache.hadoop.mrunit.mapreduce.MapDriver;

 public class MaxTemperatureMapperTest {

   @Test
   public void processesValidRecord() throws IOException {
     Text value = new Text("0043011990999991950051518004+68750+023550FM-12+0382" +
                                   // Year ^^^^
         "99999V0203201N00261220001CN9999999N9-00111+99999999999");
                               // Temperature ^^^^^
     new MapDriver<LongWritable, Text, Text, IntWritable>()
     .withMapper(new MaxTemperatureMapper())
     .withInput(new LongWritable(1), value)
     .withOutput(new Text("1950"), new IntWritable(-11))
     .runTest();
   }
 }

Note that some classes and methods are deprecated:

It is understandable that org.apache.hadoop.mrunit.MapDriver<K1,V1,K2,V2> is deprecated: this class was written for the old MapReduce API (org.apache.hadoop.mapred), as one of its methods shows:
 MapDriver<K1,V1,K2,V2> withMapper(org.apache.hadoop.mapred.Mapper<K1,V1,K2,V2> m)

The new MapReduce API is org.apache.hadoop.mapreduce.*; the corresponding MRUnit MapDriver (and likewise ReduceDriver) is org.apache.hadoop.mrunit.mapreduce.MapDriver<K1,V1,K2,V2>. Its methods likewise take the new-API types, for example:
MapDriver<K1,V1,K2,V2> withCounters(org.apache.hadoop.mapreduce.Counters ctrs)

T withInputValue(V1 val) in the MapDriverBase class is deprecated and replaced by T withInput(K1 key, V1 val); there are many more such changes that are not listed here.
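The same migration applies to ReduceDriver. As an illustration of the non-deprecated withInput(key, values) style from org.apache.hadoop.mrunit.mapreduce, here is a sketch of a test for the MaxTemperatureReducer sketched earlier (not part of the original post):

 import java.io.IOException;
 import java.util.Arrays;

 import org.apache.hadoop.io.IntWritable;
 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
 import org.junit.Test;

 public class MaxTemperatureReducerTest {

   @Test
   public void returnsMaximumIntegerInValues() throws IOException {
     new ReduceDriver<Text, IntWritable, Text, IntWritable>()
     .withReducer(new MaxTemperatureReducer())
     // A single key with several temperatures; the reducer should keep the maximum.
     .withInput(new Text("1950"),
                Arrays.asList(new IntWritable(10), new IntWritable(5)))
     .withOutput(new Text("1950"), new IntWritable(10))
     .runTest();
   }
 }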

Execution steps:

Note: download and build MRUnit, set an MRUnit_HOME variable in /home/user/.bashrc, then modify $HADOOP_HOME/libexec/hadoop-config.sh to add $MRUnit_HOME/lib/*.jar to the classpath, source $HADOOP_HOME/libexec/hadoop-config.sh, and then run the following:

javac  -d class/  MaxTemperatureMapper.java  MaxTemperatureMapperTest.java
jar -cvf test.jar -C class ./
java -cp test.jar:$CLASSPATH org.junit.runner.JUnitCore  MaxTemperatureMapperTest  # or
yarn -cp test.jar:$CLASSPATH org.junit.runner.JUnitCore  MaxTemperatureMapperTest
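If editing hadoop-config.sh is inconvenient, the test class can also be launched from a small main() that drives JUnit programmatically; this is just a sketch of an alternative (any way of putting the Hadoop, MRUnit, and JUnit jars on the classpath still applies):

 import org.junit.runner.JUnitCore;
 import org.junit.runner.Result;
 import org.junit.runner.notification.Failure;

 public class TestRunner {
   public static void main(String[] args) {
     // Programmatic equivalent of "java org.junit.runner.JUnitCore MaxTemperatureMapperTest".
     Result result = JUnitCore.runClasses(MaxTemperatureMapperTest.class);
     for (Failure failure : result.getFailures()) {
       System.out.println(failure.toString());
     }
     System.out.println(result.wasSuccessful() ? "tests passed" : "tests failed");
   }
 }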
