Using MapReduce to read files from HDFS and build an inverted word-frequency index in HBase
The data files on HDFS are T0, T1, and T2 (no file extension):
T0:
What has come into being in him was life, and the life was the light of all people.
The light shines in the darkness, and the darkness did not overcome it. Enter through the narrow gate;
for the gate is wide and the road is easy that leads to destruction, and there are many who take it.
For the gate is narrow and the road is hard that leads to life, and there are few who find it
T1:
Where, O death, is your victory? Where, O death, is your sting? The sting of death is sin, and.
The power of sin is the law. But thanks be to God, who gives us the victory through our Lord Jesus Christ.
The grass withers, the flower fades, when the breath of the LORD blows upon it; surely the people are grass.
The grass withers, the flower fades; but the word of our God will stand forever.
T2:
What has come into being in him was life, and the life was the light of all people.
The light shines in the darkness, and the darkness did not overcome it. Enter through the narrow gate;
for the gate is wide and the road is easy that leads to destruction, and there are many who take it.
For the gate is narrow and the road is hard that leads to life, and there are few who find it.
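The job reads its input from hdfs://localhost:9000/user/haduser/input/reverseIndex, so the three files have to be copied there first. Below is a minimal sketch that does this with the HDFS Java API; the local source paths and the class name UploadInput are only illustrative:

package com.pro.bq;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadInput {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode address matches the hdfsPath used in DataFromHdfs.main()
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        Path dst = new Path("/user/haduser/input/reverseIndex");
        fs.mkdirs(dst);
        // Local source paths are hypothetical; adjust them to where T0/T1/T2 actually live
        fs.copyFromLocalFile(new Path("/home/haduser/T0"), dst);
        fs.copyFromLocalFile(new Path("/home/haduser/T1"), dst);
        fs.copyFromLocalFile(new Path("/home/haduser/T2"), dst);
        fs.close();
    }
}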
The implementation is as follows:
package com.pro.bq;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.util.GenericOptionsParser;

public class DataFromHdfs {
    // Mapper: for every token in a line, emit ("word:filename", "1").
    public static class LocalMap extends Mapper<Object, Text, Text, Text> {
        private FileSplit split = null;
        private Text keydata = null;

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            split = (FileSplit) context.getInputSplit();
            StringTokenizer tokenStr = new StringTokenizer(value.toString());
            while (tokenStr.hasMoreTokens()) {
                String token = tokenStr.nextToken();
                // Strip a single trailing punctuation character (",", ".", ";" or "?")
                if (token.contains(",") || token.contains(".") || token.contains(";") || token.contains("?")) {
                    token = token.substring(0, token.length() - 1);
                }
                // Take the file name (T0/T1/T2) from the input split's path;
                // indexOf("T") assumes no earlier 'T' appears in the path
                String filePath = split.getPath().toString();
                int index = filePath.indexOf("T");
                keydata = new Text(token + ":" + filePath.substring(index));
                context.write(keydata, new Text("1"));
            }
        }
    }
    // Combiner: collapses ("word:filename", [1, 1, ...]) into ("word", "filename:count").
    // Its input and output types are the same as the map output types.
    public static class LocalCombiner extends Reducer<Text, Text, Text, Text> {
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            int index = key.toString().indexOf(":");
            Text keydata = new Text(key.toString().substring(0, index));
            String filename = key.toString().substring(index + 1);
            int sum = 0;
            for (Text val : values) {
                sum++;
            }
            context.write(keydata, new Text(filename + ":" + String.valueOf(sum)));
        }
    }
    // Reducer: for each "filename:count" value of a word, write a Put into the "index"
    // table, using the word as the row key and the "filesum" column family.
    // Note: every file writes to the same two columns of the same row, so a plain
    // scan/get shows only the most recently written file unless older versions are requested.
    public static class TableReduce extends TableReducer<Text, Text, ImmutableBytesWritable> {
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            for (Text val : values) {
                int index = val.toString().indexOf(":");
                String filename = val.toString().substring(0, index);
                int sum = Integer.parseInt(val.toString().substring(index + 1));
                String row = key.toString();
                Put put = new Put(Bytes.toBytes(row));
                // put.add(Bytes.toBytes("word"), Bytes.toBytes("content"), Bytes.toBytes(key.toString()));
                put.add(Bytes.toBytes("filesum"), Bytes.toBytes("filename"), Bytes.toBytes(filename));
                put.add(Bytes.toBytes("filesum"), Bytes.toBytes("count"), Bytes.toBytes(String.valueOf(sum)));
                context.write(new ImmutableBytesWritable(Bytes.toBytes(row)), put);
            }
        }
    }
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        conf = HBaseConfiguration.create(conf);
        // conf.set("hbase.zookeeper.quorum", "localhost");

        String hdfsPath = "hdfs://localhost:9000/user/haduser/";
        String[] argsStr = new String[] { hdfsPath + "input/reverseIndex" };
        String[] otherArgs = new GenericOptionsParser(conf, argsStr).getRemainingArgs();

        Job job = new Job(conf);
        job.setJarByClass(DataFromHdfs.class);
        job.setMapperClass(LocalMap.class);
        job.setCombinerClass(LocalCombiner.class);
        job.setReducerClass(TableReduce.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class); // the combiner's input and output types match the map output
        // The "index" table must already exist, otherwise the job fails
        TableMapReduceUtil.initTableReducerJob("index", TableReduce.class, job);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
Before running the job, create the "index" table from the HBase shell; the column family name must match the "filesum" family used in the code: create 'index','filesum'
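If you would rather create the table from Java than from the shell, here is a minimal sketch using the old HBaseAdmin client API (the class name CreateIndexTable is only illustrative):

package com.pro.bq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateIndexTable {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        if (!admin.tableExists("index")) {
            // One column family, "filesum", matching the family written by TableReduce
            HTableDescriptor desc = new HTableDescriptor("index");
            desc.addFamily(new HColumnDescriptor("filesum"));
            admin.createTable(desc);
        }
        admin.close();
    }
}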
After the job finishes, run scan 'index' in the HBase shell to inspect the stored index.
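Besides the shell, a row can also be checked from Java with the old HTable/Get client API. A minimal sketch, using "gate" (a word that occurs in T0 and T2) as an example row key; the class name ReadIndexRow is only illustrative:

package com.pro.bq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class ReadIndexRow {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "index");
        Get get = new Get(Bytes.toBytes("gate"));
        Result result = table.get(get);
        // The reducer stores the file name and the per-file count in the "filesum" family
        String filename = Bytes.toString(result.getValue(Bytes.toBytes("filesum"), Bytes.toBytes("filename")));
        String count = Bytes.toString(result.getValue(Bytes.toBytes("filesum"), Bytes.toBytes("count")));
        System.out.println("gate -> " + filename + ":" + count);
        table.close();
    }
}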