Developing MapReduce in Eclipse on Windows against a remote Hadoop cluster

Contents of the test input file used for the WordCount example (a few arbitrary lines of words):

liang ni hao ma
wo hen hao
ha
qwe
asasa
xcxc vbv xxxx aaa eee
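
Before the job can read this file it has to be available in HDFS. A minimal sketch of one way to copy it there with the Hadoop FileSystem API is shown below; the NameNode address (hdfs://master:9000), the local path, and the HDFS target directory are placeholders, not values from the original post.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch only: copy the local test file into HDFS so the WordCount job can read it.
// Host, port and paths are assumptions; replace them with your own.
public class UploadInput {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), conf);
        fs.copyFromLocalFile(new Path("D:/test/input.txt"),    // local test file (assumption)
                             new Path("/user/hadoop/input"));  // HDFS input directory (assumption)
        fs.close();
    }
}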
package com.hadoop.example1;

import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

/**
 * Classic WordCount written against the old (org.apache.hadoop.mapred) API.
 */
public class WordCount {

    /** Mapper: splits each input line into tokens and emits (word, 1). */
    public static class Map extends MapReduceBase implements
            Mapper<LongWritable, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    /** Reducer: sums the counts emitted for each word. */
    public static class Reduce extends MapReduceBase implements
            Reducer<Text, IntWritable, Text, IntWritable> {

        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);   // the reducer doubles as a combiner
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        // Input and output paths are taken from the program arguments.
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}
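
main() takes the input and output paths from the program arguments, so when the job is launched from Eclipse these are typically supplied as HDFS URIs in the Run Configuration. If the connection to the remote cluster is not picked up from the Eclipse Hadoop location settings, the Hadoop 1.x addresses can also be set directly on the JobConf, as in the following sketch; the host names and ports are placeholders, not values from the original post.

import org.apache.hadoop.mapred.JobConf;

// Sketch only: point the job at a remote Hadoop 1.x cluster explicitly.
// The addresses below are assumptions; use your own NameNode and JobTracker.
public class RemoteConfSketch {
    public static JobConf remoteConf() {
        JobConf conf = new JobConf(WordCount.class);
        conf.set("fs.default.name", "hdfs://master:9000");  // NameNode URI (placeholder)
        conf.set("mapred.job.tracker", "master:9001");      // JobTracker address (placeholder)
        return conf;
    }
}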

When the job is then submitted from Eclipse on Windows (Hadoop 1.x), it fails with the well-known staging-directory permission error:

14/05/29 13:49:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/29 13:49:16 ERROR security.UserGroupInformation: PriviledgedActionException as:ISCAS cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-ISCAS\mapred\staging\ISCAS1655603947\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-ISCAS\mapred\staging\ISCAS1655603947\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:664)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:349)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:193)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:126)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:942)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:82)

The workaround is to patch org.apache.hadoop.fs.FileUtil.checkReturnValue (the method at the top of the stack trace) so that it no longer throws, then recompile that class and put it back into the hadoop-core jar used by the project:

private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission)
                                     throws IOException {
  /* Commented out to disable the check: on Windows the local staging directory
     cannot be set to 0700, so the original code always threw here.
  if (!rv) {
    throw new IOException("Failed to set permissions of path: " + p +
                          " to " +
                          String.format("%04o", permission.toShort()));
  }
  */
}
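
With the permission check disabled, the job should submit and run. For the sample input shown at the top of the post, and assuming the default single reducer, the output file (part-00000) would contain one tab-separated count per word, roughly:

aaa	1
asasa	1
eee	1
ha	1
hao	2
hen	1
liang	1
ma	1
ni	1
qwe	1
vbv	1
wo	1
xcxc	1
xxxx	1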