Problem Scenario
Following the official guide at http://wiki.apache.org/hadoop/EclipseEnvironment, pulling the Hadoop source down to a local machine and setting up an Eclipse development environment takes only three commands:

    $ git clone git://git.apache.org/hadoop-common.git
    $ mvn install -DskipTests
    $ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

That should be all, but when I ran the second command locally, the build failed with the following error log:

    [INFO]
    [INFO] --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-common ---
    [INFO] Executing tasks
    main:
    [exec] target/compile-proto.sh: line 17: protoc: command not found
    [exec] target/compile-proto.sh: line 17: protoc: command not found
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................ SUCCESS [2.389s]
    [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.698s]
    [INFO] Apache Hadoop Annotations ......................... SUCCESS [1.761s]
    [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.729s]
    [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.353s]
    [INFO] Apache Hadoop Auth ................................ SUCCESS [1.998s]
    [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.227s]
    [INFO] Apache Hadoop Common .............................. FAILURE [1.132s]
    [INFO] Apache Hadoop Common Project ...................... SKIPPED
    [INFO] Apache Hadoop HDFS ................................ SKIPPED
    [INFO] Apache Hadoop HttpFS .............................. SKIPPED
    [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
    [INFO] Apache Hadoop HDFS Project ........................ SKIPPED
    [INFO] hadoop-yarn ....................................... SKIPPED
    [INFO] hadoop-yarn-api ................................... SKIPPED
    [INFO] hadoop-yarn-common ................................ SKIPPED
    [INFO] hadoop-yarn-server ................................ SKIPPED
    [INFO] hadoop-yarn-server-common ......................... SKIPPED
    [INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
    [INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
    [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
    [INFO] hadoop-yarn-server-tests .......................... SKIPPED
    [INFO] hadoop-mapreduce-client ........................... SKIPPED
    [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
    [INFO] hadoop-yarn-applications .......................... SKIPPED
    [INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
    [INFO] hadoop-yarn-site .................................. SKIPPED
    [INFO] hadoop-mapreduce-client-common .................... SKIPPED
    [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
    [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
    [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
    [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
    [INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
    [INFO] hadoop-mapreduce .................................. SKIPPED
    [INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
    [INFO] Apache Hadoop Distributed Copy .................... SKIPPED
    [INFO] Apache Hadoop Archives ............................ SKIPPED
    [INFO] Apache Hadoop Rumen ............................... SKIPPED
    [INFO] Apache Hadoop Extras .............................. SKIPPED
    [INFO] Apache Hadoop Tools Dist .......................... SKIPPED
    [INFO] Apache Hadoop Tools ............................... SKIPPED
    [INFO] Apache Hadoop Distribution ........................ SKIPPED
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 12.483s
    [INFO] Finished at: Mon Jan 30 22:57:23 GMT+08:00 2012
    [INFO] Final Memory: 24M/81M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <goals> -rf :hadoop-common

At this point I had not yet worked out the exact cause or the fix, so I am recording the problem here first.
Analysis
Re-running the build with the -e switch to print a fuller error message:

    $ mvn install -DskipTests -e

gives the following detailed error:

    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 9.387s
    [INFO] Finished at: Mon Jan 30 23:11:07 GMT+08:00 2012
    [INFO] Final Memory: 19M/81M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
    org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
    Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 127
        at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
        ... 19 more
    Caused by: /Users/apple/Documents/Hadoop-common-dev/hadoop-common/hadoop-common-project/hadoop-common/target/antrun/build-main.xml:23: exec returned: 127
        at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
        at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
        at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
        at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:390)
        at org.apache.tools.ant.Target.performTasks(Target.java:411)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
        at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
        ... 21 more
    [ERROR]
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR]

With the full stack trace above, it becomes much easier to track down a solution.
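
As an aside, exit status 127 is the standard shell code for "command not found", which lines up with the protoc: command not found lines in the first log. A throwaway check (illustrative only, not part of the build) reproduces the same code on a machine without protoc installed:

    $ sh -c 'protoc --version'; echo "exit status: $?"
    # Without protoc on the PATH, sh prints something like
    # "sh: protoc: command not found" and then "exit status: 127",
    # the same value reported by Ant's <exec> task above.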

Following the hint in the output, I visited https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
and found the following explanation:
        Unlike many other errors, this exception is not generated by the Maven core itself but by a plugin. As a rule of thumb, plugins use this error to signal a problem in their configuration or the information they retrieved from the POM.
In other words, the error does not come from Maven itself; as a rule of thumb, a plugin raises this exception to signal a problem with its configuration or with the information it reads from the POM.

The next step is to look at which plugin the Maven build of Hadoop is invoking when it fails.
        From the error log, the build is running the maven-antrun-plugin, and the failure happens while building hadoop-common, so I went to the pom.xml of the hadoop-common module, which contains the following:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <id>compile-proto</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <echo file="target/compile-proto.sh">
                PROTO_DIR=src/main/proto
                JAVA_DIR=target/generated-sources/java
                which cygpath 2> /dev/null
                if [ $? = 1 ]; then
                  IS_WIN=false
                else
                  IS_WIN=true
                  WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
                  WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
                fi
                mkdir -p $JAVA_DIR 2> /dev/null
                for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
                do
                  if [ "$IS_WIN" = "true" ]; then
                    protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
                  else
                    protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
                  fi
                done
              </echo>
              <exec executable="sh" dir="${basedir}" failonerror="true">
                <arg line="target/compile-proto.sh"/>
              </exec>
            </target>
          </configuration>
        </execution>
        <execution>
          <id>compile-test-proto</id>
          <phase>generate-test-sources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <echo file="target/compile-test-proto.sh">
                PROTO_DIR=src/test/proto
                JAVA_DIR=target/generated-test-sources/java
                which cygpath 2> /dev/null
                if [ $? = 1 ]; then
                  IS_WIN=false
                else
                  IS_WIN=true
                  WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
                  WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
                fi
                mkdir -p $JAVA_DIR 2> /dev/null
                for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
                do
                  if [ "$IS_WIN" = "true" ]; then
                    protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
                  else
                    protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
                  fi
                done
              </echo>
              <exec executable="sh" dir="${basedir}" failonerror="true">
                <arg line="target/compile-test-proto.sh"/>
              </exec>
            </target>
          </configuration>
        </execution>
        <execution>
          <id>save-version</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <mkdir dir="${project.build.directory}/generated-sources/java"/>
              <exec executable="sh">
                <arg line="${basedir}/dev-support/saveVersion.sh ${project.version} ${project.build.directory}/generated-sources/java"/>
              </exec>
            </target>
          </configuration>
        </execution>
        <execution>
          <id>generate-test-sources</id>
          <phase>generate-test-sources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <mkdir dir="${project.build.directory}/generated-test-sources/java"/>
              <taskdef name="recordcc" classname="org.apache.hadoop.record.compiler.ant.RccTask">
                <classpath refid="maven.compile.classpath"/>
              </taskdef>
              <recordcc destdir="${project.build.directory}/generated-test-sources/java">
                <fileset dir="${basedir}/src/test/ddl" includes="**/*.jr"/>
              </recordcc>
            </target>
          </configuration>
        </execution>
        <execution>
          <id>create-log-dir</id>
          <phase>process-test-resources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <!--
                TODO: there are tests (TestLocalFileSystem#testCopy) that fail if data
                TODO: from a previous run is present
              -->
              <delete dir="${test.build.data}"/>
              <mkdir dir="${test.build.data}"/>
              <mkdir dir="${hadoop.log.dir}"/>
              <copy toDir="${project.build.directory}/test-classes">
                <fileset dir="${basedir}/src/main/conf"/>
              </copy>
            </target>
          </configuration>
        </execution>
        <execution>
          <phase>pre-site</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <tasks>
              <copy file="src/main/resources/core-default.xml" todir="src/site/resources"/>
              <copy file="src/main/xsl/configuration.xsl" todir="src/site/resources"/>
            </tasks>
          </configuration>
        </execution>
      </executions>
    </plugin>

The snippet above is the maven-antrun-plugin configuration in the module's pom.xml. It contains this line:

    <echo file="target/compile-proto.sh">

which looks puzzling at first. Reading it together with the HowToContribute article at http://wiki.apache.org/hadoop/HowToContribute, the likely cause is that Protocol Buffers is not installed on the local machine, because the article states explicitly:

Quote:
Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work.
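
This also explains the puzzling echo line: the antrun configuration writes the shell script shown above into target/compile-proto.sh and then runs it with sh, and that script calls protoc once for every .proto file under src/main/proto. Roughly, each invocation looks like the following (the file name here is only a placeholder):

    # Roughly what target/compile-proto.sh runs per .proto file
    # (paths taken from the POM snippet above; the file name is a placeholder)
    protoc -Isrc/main/proto --java_out=target/generated-sources/java src/main/proto/SomeMessage.proto

If protoc is not on the PATH, those calls fail with exit code 127, which Ant's <exec failonerror="true"> then turns into the build failure seen in the log.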

The next step, then, is to install Protocol Buffers locally and run the build again.
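
For reference, here is a minimal sketch of one way to get protoc onto the machine by building Protocol Buffers from source; the version and download URL below are illustrative assumptions, and a package manager (MacPorts, Homebrew, apt, etc.) would work just as well:

    # Build and install Protocol Buffers from source
    # (version and URL are illustrative assumptions, not taken from the Hadoop docs)
    $ curl -LO http://protobuf.googlecode.com/files/protobuf-2.4.1.tar.gz
    $ tar xzf protobuf-2.4.1.tar.gz && cd protobuf-2.4.1
    $ ./configure && make && sudo make install
    $ protoc --version    # verify the compiler is now on the PATH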

As expected, once Protocol Buffers was installed locally, the remaining two commands ran to completion. All that was left was to import the projects under the directory into Eclipse as described in the official guide, after which the Hadoop source can be read and debugged inside Eclipse.
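
Alternatively, instead of restarting the whole build, the hint Maven printed at the end of the failed run could be followed to resume from the failed module (a sketch; the goals are simply those of the original invocation):

    # Resume the build from the module that failed, as Maven suggested
    $ mvn install -DskipTests -rf :hadoop-common
    # Then generate the Eclipse project files as in step 3
    $ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true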
