1. Download hadoop-eclipse-plugin-1.2.1.jar and copy it into the eclipse/plugins directory.

2. Open the Map/Reduce perspective

In Eclipse, go to Window --> Open Perspective --> Other and select Map/Reduce.

3. In the Map/Reduce Locations tab, create a new Location.

4. In the Project Explorer you can now browse the file system of the location just defined.

5. Prepare the test data and upload it to HDFS.

liaoliuqingdeMacBook-Air:Downloads liaoliuqing$ hadoop fs -mkdir in

liaoliuqingdeMacBook-Air:Downloads liaoliuqing$ hadoop fs -copyFromLocal maxTemp.txt in

liaoliuqingdeMacBook-Air:Downloads liaoliuqing$ hadoop fs -ls in

Found 1 items

-rw-r--r--   1 liaoliuqing supergroup        953 2014-12-14 09:47 /user/liaoliuqing/in/maxTemp.txt

The contents of maxTemp.txt are as follows:

123456798676231190101234567986762311901012345679867623119010123456798676231190101234561+00121534567890356

123456798676231190101234567986762311901012345679867623119010123456798676231190101234562+01122934567890456

123456798676231190201234567986762311901012345679867623119010123456798676231190101234562+02120234567893456

123456798676231190401234567986762311901012345679867623119010123456798676231190101234561+00321234567803456

123456798676231190101234567986762311902012345679867623119010123456798676231190101234561+00429234567903456

123456798676231190501234567986762311902012345679867623119010123456798676231190101234561+01021134568903456

123456798676231190201234567986762311902012345679867623119010123456798676231190101234561+01124234578903456

123456798676231190301234567986762311905012345679867623119010123456798676231190101234561+04121234678903456

123456798676231190301234567986762311905012345679867623119010123456798676231190101234561+00821235678903456

6. Prepare the MapReduce program

The program can be found at http://blog.csdn.net/jediael_lu/article/details/37596469
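The linked program is not reproduced here. As a rough orientation, below is a minimal sketch in the style of the classic MaxTemperature example; the field positions (year at characters 15-19, signed temperature in tenths of a degree at 87-92, quality flag at 92) are assumptions inferred from the sample records above, and class names are illustrative, not necessarily those of the linked code.

// MaxTemperatureMapper / MaxTemperatureReducer sketch (new mapreduce API, Hadoop 1.x).
// Each class would normally live in its own .java file.
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

class MaxTemperatureMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final int MISSING = 9999;

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String year = line.substring(15, 19);          // assumed year field
        int airTemperature;
        if (line.charAt(87) == '+') {                  // parseInt does not accept a leading '+'
            airTemperature = Integer.parseInt(line.substring(88, 92));
        } else {
            airTemperature = Integer.parseInt(line.substring(87, 92));
        }
        String quality = line.substring(92, 93);        // assumed quality flag
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}

class MaxTemperatureReducer
        extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());  // keep the highest temperature per year
        }
        context.write(key, new IntWritable(maxValue));
    }
}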

7. Run the program

MaxTemperature.java --> Run As --> Run Configurations

Fill in the input and output directories under Arguments (see the driver sketch below), then run.
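The two values entered in the Arguments box arrive as args[0] and args[1] of the driver's main method. A minimal driver sketch, assuming the Mapper/Reducer names from the sketch above and not necessarily matching the linked program:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MaxTemperature {

    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            System.exit(-1);
        }

        Job job = new Job();                  // Hadoop 1.x style; Job.getInstance(conf) in later versions
        job.setJarByClass(MaxTemperature.class);
        job.setJobName("Max temperature");

        // args[0] and args[1] come from the Arguments box of the run configuration
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(MaxTemperatureMapper.class);
        job.setReducerClass(MaxTemperatureReducer.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}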

Here the program runs against HDFS; in fact it can also be run against the local file system, which is a convenient way to debug the program.

For example, the arguments can simply be:

/Users/liaoliuqing/in   /Users/liaoliuqing/out

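Whether such paths resolve on HDFS or on the local disk depends on the configuration the job picks up. One possible way to force a purely local debug run (an assumption about the setup, using Hadoop 1.x property names; this is not shown in the original post) is to override the defaults before building the Job:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class LocalDebugJobFactory {
    // Hedged sketch: create a Job wired to the local file system and the
    // in-process LocalJobRunner, so plain paths like /Users/liaoliuqing/in
    // are read from the local disk.
    public static Job newLocalJob() throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "file:///");   // use the local file system
        conf.set("mapred.job.tracker", "local");   // run map/reduce in-process
        return new Job(conf, "Max temperature (local debug)");
    }
}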

8. The following output appears in the Eclipse console:

14/12/14 10:52:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

14/12/14 10:52:05 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.

14/12/14 10:52:05 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

14/12/14 10:52:05 INFO input.FileInputFormat: Total input paths to process : 1

14/12/14 10:52:05 WARN snappy.LoadSnappy: Snappy native library not loaded

14/12/14 10:52:06 INFO mapred.JobClient: Running job: job_local1815770300_0001

14/12/14 10:52:06 INFO mapred.LocalJobRunner: Waiting for map tasks

14/12/14 10:52:06 INFO mapred.LocalJobRunner: Starting task: attempt_local1815770300_0001_m_000000_0

14/12/14 10:52:06 INFO mapred.Task:  Using ResourceCalculatorPlugin : null

14/12/14 10:52:06 INFO mapred.MapTask: Processing split: hdfs://localhost:9000/user/liaoliuqing/in/maxTemp.txt:0+953

14/12/14 10:52:06 INFO mapred.MapTask: io.sort.mb = 100

14/12/14 10:52:06 INFO mapred.MapTask: data buffer = 79691776/99614720

14/12/14 10:52:06 INFO mapred.MapTask: record buffer = 262144/327680

14/12/14 10:52:06 INFO mapred.MapTask: Starting flush of map output

14/12/14 10:52:06 INFO mapred.MapTask: Finished spill 0

14/12/14 10:52:06 INFO mapred.Task: Task:attempt_local1815770300_0001_m_000000_0 is done. And is in the process of commiting

14/12/14 10:52:06 INFO mapred.LocalJobRunner:

14/12/14 10:52:06 INFO mapred.Task: Task 'attempt_local1815770300_0001_m_000000_0' done.

14/12/14 10:52:06 INFO mapred.LocalJobRunner: Finishing task: attempt_local1815770300_0001_m_000000_0

14/12/14 10:52:06 INFO mapred.LocalJobRunner: Map task executor complete.

14/12/14 10:52:06 INFO mapred.Task:  Using ResourceCalculatorPlugin : null

14/12/14 10:52:06 INFO mapred.LocalJobRunner:

14/12/14 10:52:06 INFO mapred.Merger: Merging 1 sorted segments

14/12/14 10:52:06 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 90 bytes

14/12/14 10:52:06 INFO mapred.LocalJobRunner:

14/12/14 10:52:06 INFO mapred.Task: Task:attempt_local1815770300_0001_r_000000_0 is done. And is in the process of commiting

14/12/14 10:52:06 INFO mapred.LocalJobRunner:

14/12/14 10:52:06 INFO mapred.Task: Task attempt_local1815770300_0001_r_000000_0 is allowed to commit now

14/12/14 10:52:06 INFO output.FileOutputCommitter: Saved output of task 'attempt_local1815770300_0001_r_000000_0' to hdfs://localhost:9000/user/liaoliuqing/out

14/12/14 10:52:06 INFO mapred.LocalJobRunner: reduce > reduce

14/12/14 10:52:06 INFO mapred.Task: Task 'attempt_local1815770300_0001_r_000000_0' done.

14/12/14 10:52:07 INFO mapred.JobClient:  map 100% reduce 100%

14/12/14 10:52:07 INFO mapred.JobClient: Job complete: job_local1815770300_0001

14/12/14 10:52:07 INFO mapred.JobClient: Counters: 19

14/12/14 10:52:07 INFO mapred.JobClient:   File Output Format Counters

14/12/14 10:52:07 INFO mapred.JobClient:     Bytes Written=43

14/12/14 10:52:07 INFO mapred.JobClient:   File Input Format Counters

14/12/14 10:52:07 INFO mapred.JobClient:     Bytes Read=953

14/12/14 10:52:07 INFO mapred.JobClient:   FileSystemCounters

14/12/14 10:52:07 INFO mapred.JobClient:     FILE_BYTES_READ=450

14/12/14 10:52:07 INFO mapred.JobClient:     HDFS_BYTES_READ=1906

14/12/14 10:52:07 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=135618

14/12/14 10:52:07 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=43

14/12/14 10:52:07 INFO mapred.JobClient:   Map-Reduce Framework

14/12/14 10:52:07 INFO mapred.JobClient:     Reduce input groups=5

14/12/14 10:52:07 INFO mapred.JobClient:     Map output materialized bytes=94

14/12/14 10:52:07 INFO mapred.JobClient:     Combine output records=0

14/12/14 10:52:07 INFO mapred.JobClient:     Map input records=9

14/12/14 10:52:07 INFO mapred.JobClient:     Reduce shuffle bytes=0

14/12/14 10:52:07 INFO mapred.JobClient:     Reduce output records=5

14/12/14 10:52:07 INFO mapred.JobClient:     Spilled Records=16

14/12/14 10:52:07 INFO mapred.JobClient:     Map output bytes=72

14/12/14 10:52:07 INFO mapred.JobClient:     Total committed heap usage (bytes)=329252864

14/12/14 10:52:07 INFO mapred.JobClient:     SPLIT_RAW_BYTES=118

14/12/14 10:52:07 INFO mapred.JobClient:     Map output records=8

14/12/14 10:52:07 INFO mapred.JobClient:     Combine input records=0

14/12/14 10:52:07 INFO mapred.JobClient:     Reduce input records=8
