MapReduce: reading files from HDFS and building a word-frequency inverted index in HBase
The data files on HDFS are T0, T1, and T2 (no file extension):
T0:
What has come into being in him was life, and the life was the light of all people.
The light shines in the darkness, and the darkness did not overcome it. Enter through the narrow gate;
for the gate is wide and the road is easy that leads to destruction, and there are many who take it.
For the gate is narrow and the road is hard that leads to life, and there are few who find it
T1:
Where, O death, is your victory? Where, O death, is your sting? The sting of death is sin, and.
The power of sin is the law. But thanks be to God, who gives us the victory through our Lord Jesus Christ.
The grass withers, the flower fades, when the breath of the LORD blows upon it; surely the people are grass.
The grass withers, the flower fades; but the word of our God will stand forever.
T2:
What has come into being in him was life, and the life was the light of all people.
The light shines in the darkness, and the darkness did not overcome it. Enter through the narrow gate;
for the gate is wide and the road is easy that leads to destruction, and there are many who take it.
For the gate is narrow and the road is hard that leads to life, and there are few who find it.
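The goal is an inverted index keyed by word: for each word, record which file it appears in and how many times it occurs there. By a quick manual count of the texts above (shown only to illustrate the target structure; the actual values are produced by the job below), entries would look roughly like:

    gate  -> T0:3, T2:3
    death -> T1:3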
The implementation is as follows:
package com.pro.bq;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.util.GenericOptionsParser;

public class DataFromHdfs {
    // Mapper: emits ("word:filename", "1") for every token of the input line.
    public static class LocalMap extends Mapper<Object, Text, Text, Text> {
        private FileSplit split = null;
        private Text keydata = null;

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            // The input split tells us which file (T0/T1/T2) this line came from.
            split = (FileSplit) context.getInputSplit();
            StringTokenizer tokenStr = new StringTokenizer(value.toString());
            while (tokenStr.hasMoreTokens()) {
                String token = tokenStr.nextToken();
                // Strip a single trailing punctuation mark so that "life," and "life" count as the same word.
                if (token.endsWith(",") || token.endsWith(".") || token.endsWith(";") || token.endsWith("?")) {
                    token = token.substring(0, token.length() - 1);
                }
                // Keep only the file name part of the path, e.g. "T0", starting at the first "T".
                String filePath = split.getPath().toString();
                int index = filePath.indexOf("T");
                keydata = new Text(token + ":" + filePath.substring(index));
                context.write(keydata, new Text("1"));
            }
        }
    }
    // Combiner: turns ("word:filename", ["1", "1", ...]) into ("word", "filename:count"),
    // so the reducer below receives one pre-aggregated value per file.
    public static class LocalCombiner extends Reducer<Text, Text, Text, Text> {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            int index = key.toString().indexOf(":");
            Text keydata = new Text(key.toString().substring(0, index));
            String filename = key.toString().substring(index + 1);
            int sum = 0;
            for (Text val : values) {
                sum++;
            }
            context.write(keydata, new Text(filename + ":" + String.valueOf(sum)));
        }
    }
    // Reducer: writes one HBase Put per "filename:count" value; the row key is the word.
    public static class TableReduce extends TableReducer<Text, Text, ImmutableBytesWritable> {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            for (Text val : values) {
                int index = val.toString().indexOf(":");
                String filename = val.toString().substring(0, index);
                int sum = Integer.parseInt(val.toString().substring(index + 1));
                String row = key.toString();
                // Column family "filesum" holds the file name and the word's count in that file.
                Put put = new Put(Bytes.toBytes(row));
                put.add(Bytes.toBytes("filesum"), Bytes.toBytes("filename"), Bytes.toBytes(filename));
                put.add(Bytes.toBytes("filesum"), Bytes.toBytes("count"), Bytes.toBytes(String.valueOf(sum)));
                context.write(new ImmutableBytesWritable(Bytes.toBytes(row)), put);
            }
        }
    }
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        conf = HBaseConfiguration.create(conf);
        // conf.set("hbase.zookeeper.quorum", "localhost");

        String hdfsPath = "hdfs://localhost:9000/user/haduser/";
        String[] argsStr = new String[] { hdfsPath + "input/reverseIndex" };
        String[] otherArgs = new GenericOptionsParser(conf, argsStr).getRemainingArgs();

        Job job = new Job(conf);
        job.setJarByClass(DataFromHdfs.class);
        job.setMapperClass(LocalMap.class);
        job.setCombinerClass(LocalCombiner.class);
        job.setReducerClass(TableReduce.class);
        // The combiner's input and output types are the same as the map output types.
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        // The "index" table must already exist in HBase, otherwise the job fails.
        TableMapReduceUtil.initTableReducerJob("index", TableReduce.class, job);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
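The job writes into an HBase table named "index" with a column family "filesum", and that table must exist before the job is launched. The shell command below does this; as a sketch, the same thing can be done with the old-style HBase admin API used throughout this post (the class name CreateIndexTable is made up for illustration):

package com.pro.bq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateIndexTable {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        // Create the "index" table with the "filesum" column family if it does not exist yet.
        if (!admin.tableExists("index")) {
            HTableDescriptor desc = new HTableDescriptor("index");
            desc.addFamily(new HColumnDescriptor("filesum"));
            admin.createTable(desc);
        }
        admin.close();
    }
}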
Before running the job, create the "index" table from the HBase shell with: create 'index','filesum' (the column-family name must match the "filesum" family used in the reducer).
After the job has finished, run scan 'index' in the HBase shell to inspect the result.
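The original post ends with a screenshot of that scan output, which is not reproduced here. As an alternative check, a small client program can read one word's row back; the sketch below uses the same old-style HBase client API as the job (the class name CheckIndex and the lookup word "gate" are just for illustration):

package com.pro.bq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class CheckIndex {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "index");
        // Fetch the row for one word and print the stored file name and count.
        Get get = new Get(Bytes.toBytes("gate"));
        Result result = table.get(get);
        String filename = Bytes.toString(result.getValue(Bytes.toBytes("filesum"), Bytes.toBytes("filename")));
        String count = Bytes.toString(result.getValue(Bytes.toBytes("filesum"), Bytes.toBytes("count")));
        System.out.println("gate -> " + filename + ":" + count);
        table.close();
    }
}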