【hadoop-eclipse-plugin-3.2.0 on Linux】The string of errors (and maddening detours) I hit while building the Hadoop plugin for Eclipse
2019-09-02 23:35:22
Preface: first of all, I have to laugh at myself. This took four or five evenings plus lunch breaks, and I only got it working just now. I'm actually rather impressed with myself!
Along the way I leaned on other people's blog posts, and I'm very grateful to them. Blogging is a good habit: it keeps you honest and it helps others.
Main text:
1. You need to download three things: Hadoop, Eclipse, and the hadoop-eclipse-plugin.
2. Hadoop and Eclipse come straight from their official sites; if the official download is slow, a download manager such as Xunlei helps. The hadoop-eclipse-plugin has to be found on GitHub; preferably pick the hadoop3x-eclipse-plugin project.
3. With Hadoop and Eclipse installed, we reach the main topic of this post: building hadoop-eclipse-plugin-3.x.x. I always grab the latest versions, so my Hadoop is 3.2.0 and the plugin I build here is hadoop-eclipse-plugin-3.2.0.
4. First step: extract the hadoop3x-eclipse-plugin archive under /home/hadoop/hadoop-3.2.0. This creates the eclipse-hadoop3x-master directory, and everything that follows happens inside it.
5. Edit the build.xml file under src/contrib/eclipse-plugin in that directory. Every place marked "add" in the code below is a change I made.
<!-- ==================== part one ==================== -->
<!-- Override classpath to include Eclipse SDK jars -->
<path id="classpath">
<pathelement location="${build.classes}"/>
<!--pathelement location="${hadoop.root}/build/classes"/-->
<path refid="eclipse-sdk-jars"/>
<path refid="hadoop-sdk-jars"/> <!-- new add start-->
<fileset dir="${hadoop.root}">
<include name="**/*.jar" />
</fileset>
<!-- new add end -->

<!-- ==================== part two ==================== -->
<!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
<echo message="contrib: ${name}"/>
<javac
encoding="${build.encoding}"
srcdir="${src.dir}"
includes="**/*.java"
destdir="${build.classes}"
debug="${javac.debug}"
deprecation="${javac.deprecation}"
includeantruntime="on">  <!-- new add -->
<classpath refid="classpath"/>
</javac>
</target>

<!-- ==================== part three ====================
     Note: add the entries below as your setup requires. The versions must match
     the jars shipped with your installed Hadoop, and must also line up with the
     entries in ivy/libraries.properties. -->
<copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration2-${commons-configuration2.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang3-${commons-lang3.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/javax.servlet-api-${servlet-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core4-${htrace.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/woodstox-core-5.0.3.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/stax2-api-3.1.4.jar" todir="${build.dir}/lib" verbose="true"/>
<jar
jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
manifest="${root}/META-INF/MANIFEST.MF">
<manifest>
<attribute name="Bundle-ClassPath"
value="classes/,
lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
lib/hadoop-auth-${hadoop.version}.jar,
lib/hadoop-common-${hadoop.version}.jar,
lib/hadoop-hdfs-${hadoop.version}.jar,
lib/protobuf-java-${protobuf.version}.jar,
lib/log4j-${log4j.version}.jar,
lib/commons-cli-${commons-cli.version}.jar,
lib/commons-configuration2-${commons-configuration2.version}.jar,
lib/commons-httpclient-${commons-httpclient.version}.jar,
lib/commons-lang3-${commons-lang3.version}.jar,
lib/commons-collections-${commons-collections.version}.jar,
lib/jackson-core-asl-${jackson.version}.jar,
lib/jackson-mapper-asl-${jackson.version}.jar,
lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
lib/slf4j-api-${slf4j-api.version}.jar,
lib/guava-${guava.version}.jar,
lib/hadoop-auth-${hadoop.version}.jar,
lib/netty-${netty.version}.jar,
lib/javax.servlet-api-${servlet-api.version}.jar,
lib/commons-io-${commons-io.version}.jar,
lib/woodstox-core-5.0.3.jar,
lib/stax2-api-3.1.4.jar"/>
A few things to note here:
1. The woodstox-core-5.0.3.jar and stax2-api-3.1.4.jar added in part three fix the problem where Eclipse starts fine after the plugin is installed, but clicking Add New Location does nothing.
2. The jars added under lib/ in part three must match the versions shipped with your own Hadoop installation. You have to verify them one by one, which you will run into in the next step; it is very time-consuming.
3. I forgot to show this above: remove depends="init, ivy-retrieve-common". The explanation I saw elsewhere is that with it present, the build tries to resolve those jars via Ivy and simply hangs if any are missing, while building hadoop-eclipse-plugin does not actually need the common Ivy dependencies:
<target name="compile" unless="skip.contrib">
<!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
4. Also, wherever build.xml refers to the share/hadoop/common/lib path under the Hadoop installation directory, check that it matches your actual Hadoop layout; if it does not, change it to match, or the jars cannot be resolved.
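Checking each jar version by hand (notes 2 and 4 above) is tedious, but the jar file names under share/hadoop/common/lib follow a fixed <artifact>-<version>.jar pattern, so the comparison can be scripted. A minimal sketch; the jar_version helper is my own name, not something shipped with Hadoop:

```shell
# Hypothetical helper: print "<artifact> <version>" for a jar file name, so
# the output can be compared against the entries in ivy/libraries.properties.
jar_version() {
  # Drop the .jar suffix, then split at the first hyphen followed by a digit.
  basename "$1" .jar | sed -E 's/-([0-9].*)$/ \1/'
}

# Sample names as they appear under share/hadoop/common/lib in Hadoop 3.2.0:
jar_version commons-lang3-3.7.jar              # -> commons-lang3 3.7
jar_version slf4j-log4j12-1.7.25.jar           # -> slf4j-log4j12 1.7.25
jar_version htrace-core4-4.1.0-incubating.jar  # -> htrace-core4 4.1.0-incubating
```

Run it over the whole directory, for example with `for j in /home/hadoop/hadoop-3.2.0/share/hadoop/common/lib/*.jar; do jar_version "$j"; done`, and read the result side by side with libraries.properties.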
6. Edit the libraries.properties file under ivy/ in the same directory. The only requirement is that the versions line up with the jars under share/hadoop/*/lib of your Hadoop installation; leave any entry you cannot find unchanged.
Here is my modified file:
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

#This properties file lists the versions of the various artifacts used by hadoop and components.
#It drives ivy and the generation of a maven POM

# This is the version of hadoop we are generating
hadoop.version=3.2.0
hadoop-gpl-compression.version=0.1.0

#These are the versions of our dependencies (in alphabetical order)
apacheant.version=1.7.0
ant-task.version=2.0.10
asm.version=5.0.4
aspectj.version=1.6.5
aspectj.version=1.6.11
checkstyle.version=4.2
commons-cli.version=1.2
commons-codec.version=1.11
commons-collections.version=3.2.2
commons-configuration2.version=2.1.1
commons-daemon.version=1.0.13
commons-httpclient.version=3.0.1
commons-lang3.version=3.7
commons-logging.version=1.1.13
commons-logging-api.version=1.1.13
commons-math.version=3.1.1
commons-el.version=1.0
commons-fileupload.version=1.2
commons-io.version=2.5
commons-net.version=3.6
core.version=3.1.1
coreplugin.version=1.3.2
hsqldb.version=2.3.4
ivy.version=2.1.0
jasper.version=5.5.12
jackson.version=1.9.13
#not able to figureout the version of jsp & jsp-api version to get it resolved throught ivy
# but still declared here as we are going to have a local copy from the lib folder
jsp.version=2.1
jsp-api.version=5.5.12
jsp-api-2.1.version=6.1.14
jsp-2.1.version=6.1.14
jets3t.version=0.6.1
jetty.version=6.1.26
jetty-util.version=6.1.26
jersey-core.version=1.19
jersey-json.version=1.19
jersey-server.version=1.19
junit.version=4.11
jdeb.version=0.8
jdiff.version=1.0.9
json.version=1.0
protobuf.version=2.5.0
kfs.version=0.1
netty.version=3.10.5.Final
log4j.version=1.2.17
lucene-core.version=2.3.1
htrace.version=4.1.0-incubating
mockito-all.version=1.8.5
jsch.version=0.1.45
oro.version=2.0.8
rats-lib.version=0.5.1
servlet.version=4.0.6
servlet-api.version=3.1.0
slf4j-api.version=1.7.25
slf4j-log4j12.version=1.7.25
guava.version=11.0.2
wagon-http.version=1.0-beta-2
xmlenc.version=0.52
xerces.version=1.4.4
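As a complement to the manual check in step 6, a few lines of shell can flag properties entries with no matching jar on disk. This is only a sketch under assumptions: check_versions is a made-up helper, and entries whose jars live outside the directory you point it at (hadoop.version itself, build-tool artifacts, and so on) will be reported as missing, so treat the output as hints rather than errors:

```shell
# Hypothetical helper: report every *.version entry in libraries.properties
# for which no <artifact>-<version>*.jar file exists in the given directory.
check_versions() {
  props="$1"; libdir="$2"
  grep -E '^[A-Za-z0-9.-]+\.version=' "$props" | while IFS='=' read -r key ver; do
    name="${key%.version}"
    ls "$libdir/$name-$ver"*.jar >/dev/null 2>&1 || echo "no jar for: $name-$ver"
  done
}

# Tiny demo against fabricated data rather than a real Hadoop tree:
demo="$(mktemp -d)"
touch "$demo/commons-lang3-3.7.jar"
printf '%s\n' 'commons-lang3.version=3.7' 'log4j.version=1.2.17' > "$demo/libraries.properties"
check_versions "$demo/libraries.properties" "$demo"   # -> no jar for: log4j-1.2.17
rm -rf "$demo"
```

Against a real install you would point it at the actual lib directory, e.g. `check_versions ivy/libraries.properties /home/hadoop/hadoop-3.2.0/share/hadoop/common/lib`.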
7. Run the build. For details see the README.md inside the extracted eclipse-hadoop3x-master directory; my session looked like this:
[hadoop@hadoop01 ~]$ cd /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/src/contrib/eclipse-plugin
-- switch to root before building, otherwise the build errors out
[hadoop@hadoop01 eclipse-plugin]$ su root
Password:
[root@hadoop01 eclipse-plugin]# ant jar -Dversion=3.2.0 -Dhadoop.version=3.2.0 -Declipse.home=/home/hadoop/eclipse -Dhadoop.home=/home/hadoop/hadoop-3.2.0
Buildfile: /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/src/contrib/eclipse-plugin/build.xml

compile:
     [echo] contrib: eclipse-plugin

jar:
      [jar] Building jar: /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-3.2.0.jar

BUILD SUCCESSFUL
Total time: 5 seconds
A couple of notes: first, read README.md until you actually understand how the plugin is built; second, build as root, otherwise the build fails. Yesterday I had in fact set all the parameters correctly and the build still failed, simply because I was not root.
One more gripe: the build itself succeeded, but I had downloaded the wrong Eclipse edition (the C/C++ one), so the Hadoop Map/Reduce entry never appeared in Eclipse. You need the eclipse-jee-linux-gtk-x86_64.tar.gz package.
8. Copy hadoop-eclipse-plugin-3.2.0.jar into Eclipse's plugins directory and start Eclipse.
9. Open Window ==> Preferences; if the Hadoop Map/Reduce page appears, the build worked.
10. But you are not quite done at that point: you still have to set the Hadoop installation directory on the Hadoop Map/Reduce page and carry out the follow-up steps before you can be sure the build is correct; otherwise you have to go back, fix the parameters, and rebuild.
【new error】Well, after publishing this post I spent several more days fighting with the plugin. I had assumed it was usable, and it basically was, but the folders under DFS Locations refused to open. At first Eclipse reported org/apache/htrace/core/Tracer$Builder; after a lot of fiddling it then reported "No FileSystem for scheme: hdfs". The whole time I assumed my plugin build was broken, but it was not.
After much searching I finally found the fix by following the steps in "https://blog.csdn.net/junior19/article/details/79594192"; at least I can see the folders now. Sigh...
The fix: in Edit Hadoop location, under the Advanced parameters tab, find hadoop.tmp.dir and set it to the same value as in the core-site.xml you configured for Hadoop.
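For reference, the value in question comes from Hadoop's own core-site.xml. A minimal sketch of the relevant entries; the host name, port, and /home/hadoop/tmp path below are placeholders, not values taken from this post:

```xml
<configuration>
  <!-- Must agree with hadoop.tmp.dir under Advanced parameters in Eclipse -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/tmp</value>
  </property>
  <!-- NameNode address; the DFS Master host/port in the Eclipse location
       settings must point at the same address -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop01:9000</value>
  </property>
</configuration>
```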
For other problems, see the articles below, all of which I consulted during the build; thanks again to their authors!
【Article 1】https://www.cnblogs.com/ljy2013/articles/4417933.html
【Article 2】https://www.cnblogs.com/zhangchao0515/p/7099002.html
【Article 3】https://www.cnblogs.com/PurpleDream/p/4014751.html
【Article 4】https://www.cnblogs.com/sissie-coding/p/9449941.html
Appendix: the full build.xml:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project default="jar" name="eclipse-plugin">
<import file="../build-contrib.xml"/>
<path id="eclipse-sdk-jars">
<fileset dir="${eclipse.home}/plugins/">
<include name="org.eclipse.ui*.jar"/>
<include name="org.eclipse.jdt*.jar"/>
<include name="org.eclipse.core*.jar"/>
<include name="org.eclipse.equinox*.jar"/>
<include name="org.eclipse.debug*.jar"/>
<include name="org.eclipse.osgi*.jar"/>
<include name="org.eclipse.swt*.jar"/>
<include name="org.eclipse.jface*.jar"/>
<include name="org.eclipse.team.cvs.ssh2*.jar"/>
<include name="com.jcraft.jsch*.jar"/>
</fileset>
</path>
<path id="hadoop-sdk-jars">
<fileset dir="${hadoop.home}/share/hadoop/mapreduce">
<include name="hadoop*.jar"/>
</fileset>
<fileset dir="${hadoop.home}/share/hadoop/hdfs">
<include name="hadoop*.jar"/>
</fileset>
<fileset dir="${hadoop.home}/share/hadoop/common">
<include name="hadoop*.jar"/>
</fileset>
</path>
<!-- Override classpath to include Eclipse SDK jars -->
<path id="classpath">
<pathelement location="${build.classes}"/>
<!--pathelement location="${hadoop.root}/build/classes"/-->
<path refid="eclipse-sdk-jars"/>
<path refid="hadoop-sdk-jars"/> <!-- new add -->
<fileset dir="${hadoop.root}">
<include name="**/*.jar" />
</fileset>
</path>
<!-- Skip building if eclipse.home is unset. -->
<target name="check-contrib" unless="eclipse.home">
<property name="skip.contrib" value="yes"/>
<echo message="eclipse.home unset: skipping eclipse plugin"/>
</target>
<target name="compile" unless="skip.contrib">
<!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
<echo message="contrib: ${name}"/>
<javac
encoding="${build.encoding}"
srcdir="${src.dir}"
includes="**/*.java"
destdir="${build.classes}"
debug="${javac.debug}"
deprecation="${javac.deprecation}"
includeantruntime="on">
<classpath refid="classpath"/>
</javac>
</target>
<!-- Override jar target to specify manifest -->
<target name="jar" depends="compile" unless="skip.contrib">
<mkdir dir="${build.dir}/lib"/>
<copy todir="${build.dir}/lib/" verbose="true">
<fileset dir="${hadoop.home}/share/hadoop/mapreduce">
<include name="hadoop*.jar"/>
</fileset>
</copy>
<copy todir="${build.dir}/lib/" verbose="true">
<fileset dir="${hadoop.home}/share/hadoop/common">
<include name="hadoop*.jar"/>
</fileset>
</copy>
<copy todir="${build.dir}/lib/" verbose="true">
<fileset dir="${hadoop.home}/share/hadoop/hdfs">
<include name="hadoop*.jar"/>
</fileset>
</copy>
<copy todir="${build.dir}/lib/" verbose="true">
<fileset dir="${hadoop.home}/share/hadoop/yarn">
<include name="hadoop*.jar"/>
</fileset>
</copy>
<copy todir="${build.dir}/classes" verbose="true">
<fileset dir="${root}/src/java">
<include name="*.xml"/>
</fileset>
</copy>
<copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration2-${commons-configuration2.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang3-${commons-lang3.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/javax.servlet-api-${servlet-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core4-${htrace.version}.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/woodstox-core-5.0.3.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${hadoop.home}/share/hadoop/common/lib/stax2-api-3.1.4.jar" todir="${build.dir}/lib" verbose="true"/>
<jar
jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
manifest="${root}/META-INF/MANIFEST.MF">
<manifest>
<attribute name="Bundle-ClassPath"
value="classes/,
lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
lib/hadoop-auth-${hadoop.version}.jar,
lib/hadoop-common-${hadoop.version}.jar,
lib/hadoop-hdfs-${hadoop.version}.jar,
lib/protobuf-java-${protobuf.version}.jar,
lib/log4j-${log4j.version}.jar,
lib/commons-cli-${commons-cli.version}.jar,
lib/commons-configuration2-${commons-configuration2.version}.jar,
lib/commons-httpclient-${commons-httpclient.version}.jar,
lib/commons-lang3-${commons-lang3.version}.jar,
lib/commons-collections-${commons-collections.version}.jar,
lib/jackson-core-asl-${jackson.version}.jar,
lib/jackson-mapper-asl-${jackson.version}.jar,
lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
lib/slf4j-api-${slf4j-api.version}.jar,
lib/guava-${guava.version}.jar,
lib/hadoop-auth-${hadoop.version}.jar,
lib/netty-${netty.version}.jar,
lib/javax.servlet-api-${servlet-api.version}.jar,
lib/commons-io-${commons-io.version}.jar,
lib/woodstox-core-5.0.3.jar,
lib/stax2-api-3.1.4.jar"/>
</manifest>
<fileset dir="${build.dir}" includes="classes/ lib/"/>
<!--fileset dir="${build.dir}" includes="*.xml"/-->
<fileset dir="${root}" includes="resources/ plugin.xml"/>
</jar>
</target>
</project>
Appendix: the README.md:
hadoop3x-eclipse-plugin
=======================

eclipse plugin for hadoop 3.x.x

How to build
----------------------------------------

    [hdpusr@demo hadoop2x-eclipse-plugin]$ cd src/contrib/eclipse-plugin
    # Assume hadoop installation directory is /usr/share/hadoop
    [hdpusr@apclt eclipse-plugin]$ ant jar -Dversion=3.1.0 -Dhadoop.version=3.1.0 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop

final jar will be generated at directory ${hadoop2x-eclipse-plugin}/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.4.1.jar

release version included
-------------------------------------

release/hadoop-eclipse-kepler-plugin-2.4.1.jar # not tested yet
release/hadoop-eclipse-kepler-plugin-2.2.0.jar

options required
--------------------------------------

version: plugin version
hadoop.version: hadoop version you want to compiled with
eclipse.home: path of eclipse home
hadoop.home: path of hadoop 3.x home

How to debug
--------------------------------------

start eclipse with debug parameter: /opt/eclipse/eclipse -clean -consolelog -debug

Note: compile issues resolve:
--------------------------------------

1. For different hadoop, adjust ${hadoop2x-eclipse-plugin-master}/ivy/libraries.properties, to match hadoop dependency lib version.
2. modify ${hadoop3x-eclipse}/src/contrib/eclipse-plugin/build.xml, in the node: <attribute name="Bundle-ClassPath" .... to add the jar needed.
3. For eclipse, do **not** use eclipse-inst, install eclipse **FULL VERSION(SDK)** instead.