Downloading Hadoop via Git into a Local Eclipse Development Environment
According to the official guide at http://wiki.apache.org/hadoop/EclipseEnvironment, downloading Hadoop locally and setting up an Eclipse development environment takes just three commands:
- $ git clone git://git.apache.org/hadoop-common.git
- $ mvn install -DskipTests
- $ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
However, after I ran the second command locally, the build failed with the following log:
- [INFO]
- [INFO] --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-common ---
- [INFO] Executing tasks
- main:
- [exec] target/compile-proto.sh: line 17: protoc: command not found
- [exec] target/compile-proto.sh: line 17: protoc: command not found
- [INFO] ------------------------------------------------------------------------
- [INFO] Reactor Summary:
- [INFO]
- [INFO] Apache Hadoop Main ................................ SUCCESS [2.389s]
- [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.698s]
- [INFO] Apache Hadoop Annotations ......................... SUCCESS [1.761s]
- [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.729s]
- [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.353s]
- [INFO] Apache Hadoop Auth ................................ SUCCESS [1.998s]
- [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.227s]
- [INFO] Apache Hadoop Common .............................. FAILURE [1.132s]
- [INFO] Apache Hadoop Common Project ...................... SKIPPED
- [INFO] Apache Hadoop HDFS ................................ SKIPPED
- [INFO] Apache Hadoop HttpFS .............................. SKIPPED
- [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
- [INFO] Apache Hadoop HDFS Project ........................ SKIPPED
- [INFO] hadoop-yarn ....................................... SKIPPED
- [INFO] hadoop-yarn-api ................................... SKIPPED
- [INFO] hadoop-yarn-common ................................ SKIPPED
- [INFO] hadoop-yarn-server ................................ SKIPPED
- [INFO] hadoop-yarn-server-common ......................... SKIPPED
- [INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
- [INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
- [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
- [INFO] hadoop-yarn-server-tests .......................... SKIPPED
- [INFO] hadoop-mapreduce-client ........................... SKIPPED
- [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
- [INFO] hadoop-yarn-applications .......................... SKIPPED
- [INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
- [INFO] hadoop-yarn-site .................................. SKIPPED
- [INFO] hadoop-mapreduce-client-common .................... SKIPPED
- [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
- [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
- [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
- [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
- [INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
- [INFO] hadoop-mapreduce .................................. SKIPPED
- [INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
- [INFO] Apache Hadoop Distributed Copy .................... SKIPPED
- [INFO] Apache Hadoop Archives ............................ SKIPPED
- [INFO] Apache Hadoop Rumen ............................... SKIPPED
- [INFO] Apache Hadoop Extras .............................. SKIPPED
- [INFO] Apache Hadoop Tools Dist .......................... SKIPPED
- [INFO] Apache Hadoop Tools ............................... SKIPPED
- [INFO] Apache Hadoop Distribution ........................ SKIPPED
- [INFO] ------------------------------------------------------------------------
- [INFO] BUILD FAILURE
- [INFO] ------------------------------------------------------------------------
- [INFO] Total time: 12.483s
- [INFO] Finished at: Mon Jan 30 22:57:23 GMT+08:00 2012
- [INFO] Final Memory: 24M/81M
- [INFO] ------------------------------------------------------------------------
- [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
- [ERROR]
- [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
- [ERROR] Re-run Maven using the -X switch to enable full debug logging.
- [ERROR]
- [ERROR] For more information about the errors and possible solutions, please read the following articles:
- [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
- [ERROR]
- [ERROR] After correcting the problems, you can resume the build with the command
- [ERROR] mvn <goals> -rf :hadoop-common
At that point I had not yet worked out the root cause or a fix, so I first recorded the error as-is.
Analysis
Re-running the build with the -e switch to print a fuller error:
- $ mvn install -DskipTests -e
produced the following, more detailed output:
- [INFO] ------------------------------------------------------------------------
- [INFO] BUILD FAILURE
- [INFO] ------------------------------------------------------------------------
- [INFO] Total time: 9.387s
- [INFO] Finished at: Mon Jan 30 23:11:07 GMT+08:00 2012
- [INFO] Final Memory: 19M/81M
- [INFO] ------------------------------------------------------------------------
- [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
- org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
- at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
- at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
- at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
- at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
- at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
- at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
- at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
- at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
- at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
- at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
- at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
- at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
- at java.lang.reflect.Method.invoke(Method.java:597)
- at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
- at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
- at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
- at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
- Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 127
- at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
- at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
- at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
- ... 19 more
- Caused by: /Users/apple/Documents/Hadoop-common-dev/hadoop-common/hadoop-common-project/hadoop-common/target/antrun/build-main.xml:23: exec returned: 127
- at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
- at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
- at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
- at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
- at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
- at java.lang.reflect.Method.invoke(Method.java:597)
- at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
- at org.apache.tools.ant.Task.perform(Task.java:348)
- at org.apache.tools.ant.Target.execute(Target.java:390)
- at org.apache.tools.ant.Target.performTasks(Target.java:411)
- at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
- at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
- at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
- ... 21 more
- [ERROR]
- [ERROR] Re-run Maven using the -X switch to enable full debug logging.
- [ERROR]
- [ERROR] For more information about the errors and possible solutions, please read the following articles:
- [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
- [ERROR]
The [Help 1] hint in the output above conveniently points toward a solution. Following it to https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException gives this explanation:
Unlike many other errors, this exception is not generated by the Maven core itself but by a plugin. As a rule of thumb, plugins use this error to signal a problem in their configuration or the information they retrieved from the POM.
In other words: this exception is raised by a plugin rather than by Maven core, and plugins typically use it to signal a problem in their configuration or in the information they retrieved from the POM.
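The exit code in the error message ("exec returned: 127") is itself a strong clue: 127 is the shell's conventional status for "command not found", which matches the `protoc: command not found` lines earlier in the log. A minimal demonstration:

```shell
# A nonexistent command makes the shell exit with status 127 --
# the same status Ant's <exec> task propagated up to Maven.
sh -c 'definitely-not-a-real-command' 2>/dev/null
echo "exit status: $?"   # prints: exit status: 127
```

So before digging into Maven itself, the log already suggests the build is shelling out to a program that is not on the PATH.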
The next step, then, was to check which plugins the Hadoop build uses. The error log shows that the build invokes maven-antrun-plugin, and since the failure occurred while building hadoop-common, I looked at the pom.xml of the hadoop-common project and found the following:
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-antrun-plugin</artifactId>
- <executions>
- <execution>
- <id>compile-proto</id>
- <phase>generate-sources</phase>
- <goals>
- <goal>run</goal>
- </goals>
- <configuration>
- <target>
- <echo file="target/compile-proto.sh">
- PROTO_DIR=src/main/proto
- JAVA_DIR=target/generated-sources/java
- which cygpath 2> /dev/null
- if [ $? = 1 ]; then
- IS_WIN=false
- else
- IS_WIN=true
- WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
- WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
- fi
- mkdir -p $JAVA_DIR 2> /dev/null
- for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
- do
- if [ "$IS_WIN" = "true" ]; then
- protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
- else
- protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
- fi
- done
- </echo>
- <exec executable="sh" dir="${basedir}" failonerror="true">
- <arg line="target/compile-proto.sh"/>
- </exec>
- </target>
- </configuration>
- </execution>
- <execution>
- <id>compile-test-proto</id>
- <phase>generate-test-sources</phase>
- <goals>
- <goal>run</goal>
- </goals>
- <configuration>
- <target>
- <echo file="target/compile-test-proto.sh">
- PROTO_DIR=src/test/proto
- JAVA_DIR=target/generated-test-sources/java
- which cygpath 2> /dev/null
- if [ $? = 1 ]; then
- IS_WIN=false
- else
- IS_WIN=true
- WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
- WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
- fi
- mkdir -p $JAVA_DIR 2> /dev/null
- for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
- do
- if [ "$IS_WIN" = "true" ]; then
- protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
- else
- protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
- fi
- done
- </echo>
- <exec executable="sh" dir="${basedir}" failonerror="true">
- <arg line="target/compile-test-proto.sh"/>
- </exec>
- </target>
- </configuration>
- </execution>
- <execution>
- <id>save-version</id>
- <phase>generate-sources</phase>
- <goals>
- <goal>run</goal>
- </goals>
- <configuration>
- <target>
- <mkdir dir="${project.build.directory}/generated-sources/java"/>
- <exec executable="sh">
- <arg
- line="${basedir}/dev-support/saveVersion.sh ${project.version} ${project.build.directory}/generated-sources/java"/>
- </exec>
- </target>
- </configuration>
- </execution>
- <execution>
- <id>generate-test-sources</id>
- <phase>generate-test-sources</phase>
- <goals>
- <goal>run</goal>
- </goals>
- <configuration>
- <target>
- <mkdir dir="${project.build.directory}/generated-test-sources/java"/>
- <taskdef name="recordcc" classname="org.apache.hadoop.record.compiler.ant.RccTask">
- <classpath refid="maven.compile.classpath"/>
- </taskdef>
- <recordcc destdir="${project.build.directory}/generated-test-sources/java">
- <fileset dir="${basedir}/src/test/ddl" includes="**/*.jr"/>
- </recordcc>
- </target>
- </configuration>
- </execution>
- <execution>
- <id>create-log-dir</id>
- <phase>process-test-resources</phase>
- <goals>
- <goal>run</goal>
- </goals>
- <configuration>
- <target>
- <!--
- TODO: there are tests (TestLocalFileSystem#testCopy) that fail if data
- TODO: from a previous run is present
- -->
- <delete dir="${test.build.data}"/>
- <mkdir dir="${test.build.data}"/>
- <mkdir dir="${hadoop.log.dir}"/>
- <copy toDir="${project.build.directory}/test-classes">
- <fileset dir="${basedir}/src/main/conf"/>
- </copy>
- </target>
- </configuration>
- </execution>
- <execution>
- <phase>pre-site</phase>
- <goals>
- <goal>run</goal>
- </goals>
- <configuration>
- <tasks>
- <copy file="src/main/resources/core-default.xml" todir="src/site/resources"/>
- <copy file="src/main/xsl/configuration.xsl" todir="src/site/resources"/>
- </tasks>
- </configuration>
- </execution>
- </executions>
- </plugin>
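A brief aside on the configuration above: the compile-proto.sh script written out by the `<echo>` task detects Cygwin by testing the exit status of `which cygpath`. The same pattern in isolation (a sketch — checking for any nonzero status rather than exactly 1, since `which` implementations differ in their failure codes; on a non-Windows machine cygpath is normally absent, so IS_WIN ends up false):

```shell
# Mirror the Cygwin check from the generated compile-proto.sh:
# `which` exits nonzero when the command cannot be found.
which cygpath > /dev/null 2>&1
if [ $? -ne 0 ]; then
    IS_WIN=false
else
    IS_WIN=true
fi
echo "IS_WIN=$IS_WIN"
```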
The above is the maven-antrun-plugin configuration from the pom.xml. Notice the line:
- <echo file="target/compile-proto.sh">
The script it writes out invokes protoc. Connecting this with the HowToContribute article at http://wiki.apache.org/hadoop/HowToContribute, a likely cause emerges: Protocol Buffers is not installed locally — a prerequisite that article explicitly calls out.
So the next step was to install Protocol Buffers locally and rebuild.
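One way to confirm whether protoc is reachable on the PATH — useful both before installing (to verify it really is missing) and afterward (to verify the install took effect); `command -v` here is my choice of check, not something the Hadoop build itself runs:

```shell
# Report whether the Protocol Buffers compiler is on the PATH.
if command -v protoc > /dev/null 2>&1; then
    echo "protoc found: $(protoc --version)"
else
    echo "protoc missing - install Protocol Buffers first"
fi
```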
As expected, once Protocol Buffers was installed locally, the remaining two commands ran to completion. All that was left, per the official guide, was to import the projects in the directory into Eclipse, after which the source can be studied and debugged there.