Notes on Running the Built-in WordCount Example on Hadoop 2.6.3
Platform: Hadoop 2.6.3
Mode: fully distributed
1. Prepare the text to be counted. As an example, save the following passage as eg.txt:
The Project Gutenberg EBook of War and Peace, by Leo Tolstoy This eBook is for the use of anyone anywhere at no cost and with almost
no restrictions whatsoever. You may copy it, give it away or re-use it
under the terms of the Project Gutenberg License included with this
eBook or online at www.gutenberg.org Title: War and Peace Author: Leo Tolstoy
2. Upload the file to HDFS from the shell:
hadoop fs -put ./eg.txt /
3. Change into the share/hadoop/mapreduce directory and launch the WordCount job (the jar name must match the installed version, here 2.6.3):
hadoop jar hadoop-mapreduce-examples-2.6.3.jar wordcount /eg.txt /out
4. The job prints output like the following to the console:
16/03/29 21:30:26 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/03/29 21:30:30 INFO input.FileInputFormat: Total input paths to process : 1
16/03/29 21:30:30 INFO mapreduce.JobSubmitter: number of splits:1
16/03/29 21:30:31 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1459233715960_0004
16/03/29 21:30:31 INFO impl.YarnClientImpl: Submitted application application_1459233715960_0004
16/03/29 21:30:31 INFO mapreduce.Job: The url to track the job: http://m1.fredlab.org:8088/proxy/application_1459233715960_0004/
16/03/29 21:30:31 INFO mapreduce.Job: Running job: job_1459233715960_0004
16/03/29 21:30:47 INFO mapreduce.Job: Job job_1459233715960_0004 running in uber mode : false
16/03/29 21:30:47 INFO mapreduce.Job: map 0% reduce 0%
16/03/29 21:30:57 INFO mapreduce.Job: map 100% reduce 0%
16/03/29 21:31:09 INFO mapreduce.Job: map 100% reduce 100%
16/03/29 21:31:10 INFO mapreduce.Job: Job job_1459233715960_0004 completed successfully
16/03/29 21:31:11 INFO mapreduce.Job: Counters: 49
    File System Counters
        FILE: Number of bytes read=547
        FILE: Number of bytes written=213761
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=453
        HDFS: Number of bytes written=361
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=7594
        Total time spent by all reduces in occupied slots (ms)=9087
        Total time spent by all map tasks (ms)=7594
        Total time spent by all reduce tasks (ms)=9087
        Total vcore-milliseconds taken by all map tasks=7594
        Total vcore-milliseconds taken by all reduce tasks=9087
        Total megabyte-milliseconds taken by all map tasks=7776256
        Total megabyte-milliseconds taken by all reduce tasks=9305088
    Map-Reduce Framework
        Map input records=11
        Map output records=62
        Map output bytes=598
        Map output materialized bytes=547
        Input split bytes=98
        Combine input records=62
        Combine output records=45
        Reduce input groups=45
        Reduce shuffle bytes=547
        Reduce input records=45
        Reduce output records=45
        Spilled Records=90
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=310
        CPU time spent (ms)=2010
        Physical memory (bytes) snapshot=273182720
        Virtual memory (bytes) snapshot=4122341376
        Total committed heap usage (bytes)=137498624
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=355
    File Output Format Counters
        Bytes Written=361
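The resource counters above are internally consistent: the "megabyte-milliseconds" figures are simply the task milliseconds multiplied by the container size, which defaults to 1024 MB (`mapreduce.map.memory.mb` / `mapreduce.reduce.memory.mb`). A quick shell check, assuming those defaults were in effect:

```shell
# megabyte-milliseconds = task milliseconds x container size in MB (default 1024)
echo $(( 7594 * 1024 ))   # map tasks:    7776256
echo $(( 9087 * 1024 ))   # reduce tasks: 9305088
```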
5. The results are written to the /out directory on the Hadoop cluster. A successful run creates an empty _SUCCESS marker file there and stores the word counts in part-r-00000:
Author: 1
EBook 1
Gutenberg 2
Leo 2
License 1
Peace 1
Peace, 1
Project 2
The 1
This 1
Title: 1
Tolstoy 2
War 2
You 1
almost 1
and 3
anyone 1
anywhere 1
at 2
away 1
by 1
copy 1
cost 1
eBook 2
for 1
give 1
included 1
is 1
it 2
it, 1
may 1
no 2
of 3
online 1
or 2
re-use 1
restrictions 1
terms 1
the 3
this 1
under 1
use 1
whatsoever. 1
with 2
www.gutenberg.org 1
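The listing above matches the job counters exactly: 62 map output records (one per whitespace-separated token) and 45 reduce output records (one per distinct token). Because the example's mapper splits lines with Java's StringTokenizer, punctuation stays attached to words, which is why "Peace" and "Peace," are counted separately. The counts can be reproduced locally with coreutils; eg_sample.txt below is just a scratch copy of the passage from step 1 (the original eg.txt was evidently wrapped into 11 lines, per Map input records=11, but line breaks do not affect the token counts):

```shell
# Recreate the sample passage locally.
cat > eg_sample.txt <<'EOF'
The Project Gutenberg EBook of War and Peace, by Leo Tolstoy This eBook is for the use of anyone anywhere at no cost and with almost
no restrictions whatsoever. You may copy it, give it away or re-use it
under the terms of the Project Gutenberg License included with this
eBook or online at www.gutenberg.org Title: War and Peace Author: Leo Tolstoy
EOF

# One token per line, splitting on whitespace like the mapper does.
tr -s ' \t' '\n' < eg_sample.txt | grep -v '^$' > tokens.txt

wc -l < tokens.txt            # total tokens   -> Map output records = 62
sort -u tokens.txt | wc -l    # distinct words -> Reduce output records = 45

# Per-word counts, matching part-r-00000.
sort tokens.txt | uniq -c | sort -k2
```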
More plain-text books for practice can be downloaded from http://www.gutenberg.org/.