Implementing the Writable interface of Hadoop
As we saw in the previous posts, Hadoop makes heavy use of network transmissions to execute its jobs. As Doug Cutting (the creator of Hadoop) explains in this post on the Lucene mailing list, java.io.Serializable is too heavy for Hadoop's needs, so a new interface was developed: Writable. Every object you need to emit from mappers to reducers, or to write as job output, has to implement this interface so that Hadoop can transmit its data between the nodes of the cluster.
Hadoop comes with several wrappers around primitive types and widely used classes in Java:
Java primitive | Writable implementation
---|---
boolean | BooleanWritable
byte | ByteWritable
short | ShortWritable
int | IntWritable, VIntWritable
float | FloatWritable
long | LongWritable, VLongWritable
double | DoubleWritable

Java class | Writable implementation
---|---
String | Text
byte[] | BytesWritable
Object | ObjectWritable
null | NullWritable

Java collection | Writable implementation
---|---
array | ArrayWritable, ArrayPrimitiveWritable, TwoDArrayWritable
Map | MapWritable
SortedMap | SortedMapWritable
EnumSet | EnumSetWritable
For example, if we need a mapper to emit a String, we need to use a Text object wrapped around the string we want to emit.
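As a minimal sketch of what this looks like in practice (not taken from the original post; the class and field names below are made up for illustration), a mapper can keep a reusable Text instance and set it to the String it wants to emit:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LineMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private final Text word = new Text();               // reusable wrapper for the String we emit
    private final IntWritable one = new IntWritable(1);  // reusable wrapper for the int we emit

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // wrap the plain Java String into a Text before emitting it
        word.set(value.toString().trim());
        context.write(word, one);
    }
}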
The interface Writable defines two methods:
- public void write(DataOutput dataOutput) throws IOException
- public void readFields(DataInput dataInput) throws IOException
The first method, write(), is used for writing the data onto the stream, while the second method, readFields(), is used for reading data from the stream. The wrappers we saw above just send and receive their binary representation over a stream.
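To see these two methods at work outside of a MapReduce job, here is a small sketch (not from the original post) that writes an IntWritable to an in-memory stream and reads it back with readFields(); this round trip is essentially what Hadoop performs when it ships values between nodes:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;

public class WritableRoundTrip {
    public static void main(String[] args) throws IOException {
        // write(): dump the binary representation of the value onto a stream
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        IntWritable written = new IntWritable(42);
        written.write(new DataOutputStream(buffer));

        // readFields(): rebuild an equal value from those same bytes
        IntWritable read = new IntWritable();
        read.readFields(new DataInputStream(new ByteArrayInputStream(buffer.toByteArray())));

        System.out.println(read.get());   // prints 42
    }
}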
Since Hadoop also needs to sort data during the shuffle-and-sort phase, the emitted objects also have to implement the Comparable interface; for this purpose Hadoop defines the WritableComparable interface, which extends both Writable and Comparable.
If we need to emit a custom object for which no default wrapper exists, we have to create a class that implements the WritableComparable interface. In the mean-computation example we saw in this post, we used the SumCount class, which implements WritableComparable (the source code is available on GitHub):
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.WritableComparable;

public class SumCount implements WritableComparable<SumCount> {

    DoubleWritable sum;
    IntWritable count;

    public SumCount() {
        set(new DoubleWritable(0), new IntWritable(0));
    }

    public SumCount(Double sum, Integer count) {
        set(new DoubleWritable(sum), new IntWritable(count));
    }

    public void set(DoubleWritable sum, IntWritable count) {
        this.sum = sum;
        this.count = count;
    }

    public DoubleWritable getSum() {
        return sum;
    }

    public IntWritable getCount() {
        return count;
    }

    public void addSumCount(SumCount sumCount) {
        set(new DoubleWritable(this.sum.get() + sumCount.getSum().get()), new IntWritable(this.count.get() + sumCount.getCount().get()));
    }

    @Override
    public void write(DataOutput dataOutput) throws IOException {
        // writes the two fields onto the stream, in this order
        sum.write(dataOutput);
        count.write(dataOutput);
    }

    @Override
    public void readFields(DataInput dataInput) throws IOException {
        // reads the two fields back, in the same order they were written
        sum.readFields(dataInput);
        count.readFields(dataInput);
    }

    @Override
    public int compareTo(SumCount sumCount) {
        // compares the first of the two values
        int comparison = sum.compareTo(sumCount.sum);
        // if they're not equal, return the value of compareTo between the "sum" values
        if (comparison != 0) {
            return comparison;
        }
        // else return the value of compareTo between the "count" values
        return count.compareTo(sumCount.count);
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        SumCount sumCount = (SumCount) o;
        return count.equals(sumCount.count) && sum.equals(sumCount.sum);
    }

    @Override
    public int hashCode() {
        int result = sum.hashCode();
        result = 31 * result + count.hashCode();
        return result;
    }
}
As we can see, it's very easy to code the two methods defined by the Writable interface: they just delegate to the write() and readFields() methods of the wrapper types used for the properties of the SumCount class. It's important to handle the properties in the same order in both write() and readFields(), otherwise deserialization will not work. The other methods of this class are the getters, the setter and the methods needed by the Comparable interface, which should be nothing new to a Java developer.
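As a rough idea of how such a class fits into a job (this reducer is only an illustrative sketch with assumed class names and key/value types, not the actual code from the post's GitHub repository), a reducer computing a mean could aggregate the incoming SumCount values with addSumCount() and then divide the total sum by the total count:

import java.io.IOException;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MeanReducer extends Reducer<Text, SumCount, Text, DoubleWritable> {

    private final DoubleWritable mean = new DoubleWritable();

    @Override
    protected void reduce(Text key, Iterable<SumCount> values, Context context)
            throws IOException, InterruptedException {
        // accumulate all the partial sums and counts received for this key
        SumCount total = new SumCount();
        for (SumCount value : values) {
            total.addSumCount(value);
        }
        // the mean is simply the total sum divided by the total count
        mean.set(total.getSum().get() / total.getCount().get());
        context.write(key, mean);
    }
}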
from: http://andreaiacono.blogspot.com/2014/05/implementing-writable-interface-of.html