Learning Mahout over the past while has had its highs and lows. First of all, thanks to teacher Fan Zhe for his guidance. Below I write up this Mahout classification exercise and the problems I hit along the way; suggestions are very welcome. (All file operations here are performed on HDFS.)

(My environment: Mahout 0.9 + Hadoop 2.2.0.)

1. Convert the pre-classified files to SequenceFile storage

The data is the 20newsgroups corpus (I work in Eclipse on Linux, with the eclipse-hadoop plugin installed); a listing of the data is shown below:

(screenshot: listing of the 20newsgroups data)

I then wrote Java code to convert these 20newsgroups category files into a SequenceFile, as shown below:

(screenshot: Java code for the SequenceFile conversion)
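Since the code only survives as a screenshot, here is a minimal sketch of what such a conversion can look like. The class name, paths, and directory layout are my illustration rather than the original program; it mirrors what Mahout's own `seqdirectory` tool does, writing one Text/Text record per document with the category path as the key:

```java
import java.io.File;
import java.nio.file.Files;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SeqFileWriter {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path out = new Path("hdfs://h1:9000/classifier/classifier-seq/chunk-0");

        try (SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, out, Text.class, Text.class)) {
            // Assumes the local root contains only category directories,
            // each holding one plain-text file per document.
            File root = new File("20news-bydate-train");
            for (File category : root.listFiles()) {
                for (File doc : category.listFiles()) {
                    // One record per document: key = /<category>/<docId>,
                    // value = the document body.
                    String key = "/" + category.getName() + "/" + doc.getName();
                    String body = new String(Files.readAllBytes(doc.toPath()));
                    writer.append(new Text(key), new Text(body));
                }
            }
        }
    }
}
```

This needs a running HDFS and the Hadoop client jars on the classpath; the same conversion is also available as the `mahout seqdirectory` command-line tool.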

Running this program generates the SequenceFile output.

I hacked up some code to view the SequenceFile contents and verify the conversion.

2. Convert the SequenceFile to vector files

Using the SequenceFile output from the previous step as input, convert it to vector files. The Java conversion code is shown below:

(screenshot: Java code for the vector conversion)
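Again the original code only exists as a screenshot. As a sketch, this step can be driven programmatically with Mahout 0.9's `SparseVectorsFromSequenceFiles` (the class behind `mahout seq2sparse`); the paths and options here are my assumptions:

```java
import org.apache.hadoop.util.ToolRunner;
import org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles;

public class Seq2SparseDriver {
    public static void main(String[] args) throws Exception {
        // Equivalent to the `mahout seq2sparse` command; writes the
        // dictionary, tf-vectors, and tfidf-vectors under the output dir.
        ToolRunner.run(new SparseVectorsFromSequenceFiles(), new String[] {
            "-i", "hdfs://h1:9000/classifier/classifier-seq",
            "-o", "hdfs://h1:9000/classifier/classifier-vectors",
            "-wt", "tfidf",   // weight by tf-idf
            "-lnorm",         // apply log normalization
            "-nv"             // emit NamedVectors (keeps the doc path as the name)
        });
    }
}
```

Like the previous step, this runs against the cluster, so it is only a sketch of the wiring, not something verified standalone.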

The resulting vector files are shown below:

(screenshot: the generated vector directories, including tfidf-vectors)

Using a Java program to view the content of part-m-00000 under tfidf-vectors gives the following (partial output):

key= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/sci.med/58064   value= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/sci.med/58064:{40949:4.404207229614258,58885:6.966500282287598,52884:1.404096007347107,54411:6.043336868286133,25074:16.67474365234375,20175:6.222922325134277,32290:7.155742168426514,57777:6.696209907531738,59820:5.825411796569824,53261:5.4359564781188965,15068:9.235183715820312,68017:4.232685565948486,47050:9.235183715820312,39193:8.724358558654785,56670:3.387782096862793,69370:1.5702117681503296,21521:7.7688469886779785,54393:5.597597599029541,62286:4.329909324645996,11644:7.128331661224365,12511:1.7634414434432983,17861:4.079967498779297,39895:7.338063716888428,55445:7.62574577331543,50765:9.235183715820312,41274:7.338063716888428,19358:7.155742168426514,19837:9.235183715820312,46237:9.235183715820312,24355:2.2430875301361084,52769:7.289273738861084,24217:1.9966870546340942,19749:3.0007731914520264,28651:7.114920139312744,24284:2.4492197036743164,61132:6.2394514083862305,39146:5.458599090576172,34340:7.289273738861084,34936:7.443424224853516,10911:6.129103660583496,12647:8.254354476928711,47554:1.0402182340621948,40788:7.338063716888428,10482:7.037959098815918,62155:3.3696606159210205,33813:1.230707049369812,24044:8.947502136230469,63344:5.73867654800415,68080:6.2394514083862305,17029:5.118860244750977,65110:7.075699806213379,32127:5.7694478034973145,30288:3.71773099899292,13537:3.087428092956543,11545:8.254354476928711,47708:7.935900688171387,47276:2.321446418762207,24045:8.947502136230469,56652:3.911620616912842,10107:12.653678894042969,10233:3.475231170654297,34068:5.321562767028809,58884:7.338063716888428,68165:1.6761456727981567,40591:9.235183715820312,47733:11.100005149841309,58624:4.493154525756836,50783:11.100005149841309,44628:6.572596073150635}

key= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/talk.politics.guns/54390   value= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/talk.politics.guns/54390:{25902:9.235183715820312,47590:4.066595554351807,48033:3.2437193393707275,67021:2.7126009464263916,24472:8.947502136230469,44782:1.9297716617584229,42373:5.015676021575928,24015:2.4575374126434326,11106:1.9394488334655762,42273:2.070721387863159,62394:3.628157138824463,61226:7.848889350891113,44453:2.7429440021514893,21501:15.389477729797363,32332:3.7448697090148926,64726:1.9358370304107666,48742:7.499622821807861,60003:15.995806694030762,50448:4.258450031280518,14327:3.194135904312134,50798:7.7688469886779785,47387:3.7190706729888916,47554:1.0402182340621948,11201:3.9916746616363525,53791:2.5926971435546875,19329:7.785928249359131,52953:2.958540439605713,13970:5.588863849639893,46327:8.387886047363281,58697:4.107259273529053,49227:15.995806694030762,18911:6.383354663848877,50439:3.510509967803955,9861:1.9852583408355713,56602:3.647935152053833,50458:1.7995492219924927,36905:5.748828887939453,66718:5.264892101287842,33813:2.7519447803497314,68017:2.9929606914520264,23442:3.458564043045044,1890:3.5897369384765625,44013:5.357062339782715,35455:3.657973051071167,65123:2.995558023452759,56080:7.561207294464111,40309:4.490251541137695,26572:8.628969192504883,23439:3.0159199237823486,27894:5.060796737670898,46052:1.8622281551361084,18320:5.84535026550293,17803:3.757326602935791,33291:1.8489196300506592}

key= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/comp.windows.x/67312   value= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/comp.windows.x/67312:{47554:1.0402182340621948,6258:3.1925511360168457,33291:1.8489196300506592,788:5.197997570037842,32485:4.425988674163818,53061:4.7731146812438965,68774:6.114288330078125,4487:4.123196125030518,65155:4.61021089553833,65670:5.0815229415893555,5285:10.784433364868164,50458:1.7995492219924927,35455:5.173154830932617,46052:1.8622281551361084,27311:8.298446655273438,8749:3.7985548973083496,26321:6.401970386505127,4587:13.633935928344727,1109:3.6212668418884277,34867:6.085300922393799,41201:3.171398639678955,25533:13.633935928344727}
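A sketch of the kind of reader that produces a dump like the one above, assuming Mahout's `SequenceFileIterable` helper is available (the path is from my setup; each value is a sparse tf-idf vector in `{termId:weight,...}` form):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.mahout.common.Pair;
import org.apache.mahout.common.iterator.sequencefile.SequenceFileIterable;
import org.apache.mahout.math.VectorWritable;

public class VectorDumper {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        Path path = new Path(
            "hdfs://h1:9000/classifier/classifier-vectors/tfidf-vectors/part-m-00000");
        // Each record: key = document path, value = sparse tf-idf vector.
        for (Pair<Text, VectorWritable> record :
                new SequenceFileIterable<Text, VectorWritable>(path, true, conf)) {
            System.out.println("key= " + record.getFirst()
                + "   value= " + record.getSecond().get().asFormatString());
        }
    }
}
```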

3. Train the model

Using the part-m-00000 file under tfidf-vectors generated above as input, the code to build the training model is as follows:
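As a sketch of this step: Mahout 0.9's `TrainNaiveBayesJob` (the same class that appears in the stack trace further down) can be driven from Java like this. The paths and the indexlabel location are my assumptions; `-el` tells the job to extract labels from the vector keys and write them to the label-index file:

```java
import org.apache.hadoop.util.ToolRunner;
import org.apache.mahout.classifier.naivebayes.training.TrainNaiveBayesJob;

public class TrainnbDriver {
    public static void main(String[] args) throws Exception {
        // Equivalent to the `mahout trainnb` command.
        ToolRunner.run(new TrainNaiveBayesJob(), new String[] {
            "-i", "hdfs://h1:9000/classifier/classifier-vectors/tfidf-vectors",
            "-o", "hdfs://h1:9000/classifier/model",
            "-li", "hdfs://h1:9000/classifier/indexlabel",
            "-el",  // extract category labels from the SequenceFile keys
            "-ow"   // overwrite previous output
        });
    }
}
```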

The run produces the following outputs:

(1) the trained model

(2) the indexlabel file

Inspecting the indexlabel file, however, shows that its contents are wrong.

The indexlabel content is as follows (clearly wrong; I couldn't figure out why, and a lot of searching online turned up no fix):

key= hdfs:   value= 0

The workaround I tried was to modify the Mahout source in org.apache.mahout.classifier.naivebayes.BayesUtils.java, changing toString())[1] to toString())[7], as shown below:

The reason for changing this array index: BayesUtils.java splits the key on "/" by default, and element [7] of the split array is exactly the category label we need.

(screenshot: the original BayesUtils.java code)

Changed to:

(screenshot: the modified BayesUtils.java code)

After this change the indexlabel file is correct (all 20 category labels show up properly), as shown below:
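Why index [1] picks up "hdfs:" while [7] picks up the category is easy to see by splitting one of the vector keys by hand (plain Java, nothing Mahout-specific). Note that [7] only works for exactly this directory depth; a different path layout would need a different index:

```java
public class LabelIndexDemo {
    // Mimics deriving a label from a SequenceFile key by splitting
    // the path on "/" and picking one component.
    static String componentAt(String key, int index) {
        return key.split("/")[index];
    }

    public static void main(String[] args) {
        String key = "/hdfs://h1:9000/classifier/classifier-src4/"
                   + "20news-bydate-train/sci.med/58064";
        // The leading "/" and the "//" in "hdfs://" produce empty
        // components at [0] and [2], shifting everything right.
        System.out.println(componentAt(key, 1)); // prints "hdfs:"  (the bogus label)
        System.out.println(componentAt(key, 7)); // prints "sci.med" (the real category)
    }
}
```

This matches exactly what the broken indexlabel showed: every key yields "hdfs:" at index [1], so the file collapses to a single bogus label.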

However, running the training job then hit a new error (see the run log below):

2014-07-21 20:45:27,738 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir

2014-07-21 20:45:27,738 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.compress.map.output is deprecated. Instead, use mapreduce.map.output.compress

2014-07-21 20:45:27,738 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir

2014-07-21 20:45:27,837 INFO  [main] client.RMProxy (RMProxy.java:createRMProxy(56)) - Connecting to ResourceManager at h1/192.168.1.130:8032

2014-07-21 20:45:28,085 WARN  [main] mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.

2014-07-21 20:45:28,963 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(287)) - Total input paths to process : 1

2014-07-21 20:45:29,025 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(394)) - number of splits:1

2014-07-21 20:45:29,036 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - user.name is deprecated. Instead, use mapreduce.job.user.name

2014-07-21 20:45:29,037 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.jar is deprecated. Instead, use mapreduce.job.jar

2014-07-21 20:45:29,037 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes

2014-07-21 20:45:29,037 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files

2014-07-21 20:45:29,038 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class

2014-07-21 20:45:29,038 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class

2014-07-21 20:45:29,038 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.combine.class is deprecated. Instead, use mapreduce.job.combine.class

2014-07-21 20:45:29,045 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class

2014-07-21 20:45:29,045 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.job.name is deprecated. Instead, use mapreduce.job.name

2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.reduce.class is deprecated. Instead, use mapreduce.job.reduce.class

2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class

2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class

2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps

2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps

2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class

2014-07-21 20:45:29,047 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class

2014-07-21 20:45:29,047 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir

2014-07-21 20:45:29,110 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(477)) - Submitting tokens for job: job_1405911825122_0013

2014-07-21 20:45:29,310 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(174)) - Submitted application application_1405911825122_0013 to ResourceManager at h1/192.168.1.130:8032

2014-07-21 20:45:29,349 INFO  [main] mapreduce.Job (Job.java:submit(1272)) - The url to track the job: http://h1:8088/proxy/application_1405911825122_0013/

2014-07-21 20:45:29,349 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1317)) - Running job: job_1405911825122_0013

2014-07-21 20:45:34,926 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1338)) - Job job_1405911825122_0013 running in uber mode : false

2014-07-21 20:45:34,928 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 0% reduce 0%

2014-07-21 20:45:44,229 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 0%

2014-07-21 20:45:49,803 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 100%

2014-07-21 20:45:49,818 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1356)) - Job job_1405911825122_0013 completed successfully

2014-07-21 20:45:49,910 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1363)) - Counters: 44

    File System Counters

        FILE: Number of bytes read=680

        FILE: Number of bytes written=161999

        FILE: Number of read operations=0

        FILE: Number of large read operations=0

        FILE: Number of write operations=0

        HDFS: Number of bytes read=18538518

        HDFS: Number of bytes written=97

        HDFS: Number of read operations=7

        HDFS: Number of large read operations=0

        HDFS: Number of write operations=2

    Job Counters

        Launched map tasks=1

        Launched reduce tasks=1

        Data-local map tasks=1

        Total time spent by all maps in occupied slots (ms)=42768

        Total time spent by all reduces in occupied slots (ms)=2352

    Map-Reduce Framework

        Map input records=11314

        Map output records=0

        Map output bytes=0

        Map output materialized bytes=14

        Input split bytes=151

        Combine input records=0

        Combine output records=0

        Reduce input groups=0

        Reduce shuffle bytes=14

        Reduce input records=0

        Reduce output records=0

        Spilled Records=0

        Shuffled Maps =1

        Failed Shuffles=0

        Merged Map outputs=1

        GC time elapsed (ms)=427

        CPU time spent (ms)=3470

        Physical memory (bytes) snapshot=270143488

        Virtual memory (bytes) snapshot=808521728

        Total committed heap usage (bytes)=201457664

    Shuffle Errors

        BAD_ID=0

        CONNECTION=0

        IO_ERROR=0

        WRONG_LENGTH=0

        WRONG_MAP=0

        WRONG_REDUCE=0

    File Input Format Counters

        Bytes Read=18538367

    File Output Format Counters

        Bytes Written=97

    org.apache.mahout.classifier.naivebayes.training.IndexInstancesMapper$Counter

        SKIPPED_INSTANCES=11314

2014-07-21 20:45:49,934 INFO  [main] client.RMProxy (RMProxy.java:createRMProxy(56)) - Connecting to ResourceManager at h1/192.168.1.130:8032

2014-07-21 20:45:49,961 WARN  [main] mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.

2014-07-21 20:45:50,218 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(287)) - Total input paths to process : 1

2014-07-21 20:45:50,245 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(394)) - number of splits:1

2014-07-21 20:45:50,853 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(477)) - Submitting tokens for job: job_1405911825122_0014

2014-07-21 20:45:50,905 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(174)) - Submitted application application_1405911825122_0014 to ResourceManager at h1/192.168.1.130:8032

2014-07-21 20:45:50,908 INFO  [main] mapreduce.Job (Job.java:submit(1272)) - The url to track the job: http://h1:8088/proxy/application_1405911825122_0014/

2014-07-21 20:45:50,908 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1317)) - Running job: job_1405911825122_0014

2014-07-21 20:46:01,907 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1338)) - Job job_1405911825122_0014 running in uber mode : false

2014-07-21 20:46:01,907 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 0% reduce 0%

2014-07-21 20:46:06,258 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 0%

2014-07-21 20:46:11,815 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 100%

2014-07-21 20:46:11,824 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1356)) - Job job_1405911825122_0014 completed successfully

2014-07-21 20:46:11,852 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1363)) - Counters: 43

    File System Counters

        FILE: Number of bytes read=22

        FILE: Number of bytes written=162241

        FILE: Number of read operations=0

        FILE: Number of large read operations=0

        FILE: Number of write operations=0

        HDFS: Number of bytes read=237

        HDFS: Number of bytes written=90

        HDFS: Number of read operations=7

        HDFS: Number of large read operations=0

        HDFS: Number of write operations=2

    Job Counters

        Launched map tasks=1

        Launched reduce tasks=1

        Data-local map tasks=1

        Total time spent by all maps in occupied slots (ms)=12960

        Total time spent by all reduces in occupied slots (ms)=2214

    Map-Reduce Framework

        Map input records=0

        Map output records=0

        Map output bytes=0

        Map output materialized bytes=14

        Input split bytes=140

        Combine input records=0

        Combine output records=0

        Reduce input groups=0

        Reduce shuffle bytes=14

        Reduce input records=0

        Reduce output records=0

        Spilled Records=0

        Shuffled Maps =1

        Failed Shuffles=0

        Merged Map outputs=1

        GC time elapsed (ms)=31

        CPU time spent (ms)=1390

        Physical memory (bytes) snapshot=300437504

        Virtual memory (bytes) snapshot=809414656

        Total committed heap usage (bytes)=225509376

    Shuffle Errors

        BAD_ID=0

        CONNECTION=0

        IO_ERROR=0

        WRONG_LENGTH=0

        WRONG_MAP=0

        WRONG_REDUCE=0

    File Input Format Counters

        Bytes Read=97

    File Output Format Counters

        Bytes Written=90

Exception in thread "main" java.lang.NullPointerException

    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:187)

    at org.apache.mahout.classifier.naivebayes.BayesUtils.readModelFromDir(BayesUtils.java:81)

    at org.apache.mahout.classifier.naivebayes.training.TrainNaiveBayesJob.run(TrainNaiveBayesJob.java:162)

    at com.redhadoop.trainnb.TrainnbTest.main(TrainnbTest.java:48)

The code at line 81 of BayesUtils.java, where the stack trace points, is as follows:

Note also the SKIPPED_INSTANCES=11314 counter in the first job above: all 11314 map input records were skipped by IndexInstancesMapper (Map output records=0), so the model output is empty, which is presumably why readModelFromDir then fails with a NullPointerException.

4. Test with the trained model

The Java test code is as follows:
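As a sketch of what this step looks like in Mahout 0.9, `TestNaiveBayesDriver` (the class behind `mahout testnb`) classifies a set of held-out vectors with the trained model and prints a confusion matrix. All paths here are my assumptions:

```java
import org.apache.hadoop.util.ToolRunner;
import org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver;

public class TestnbDriver {
    public static void main(String[] args) throws Exception {
        // Equivalent to the `mahout testnb` command.
        ToolRunner.run(new TestNaiveBayesDriver(), new String[] {
            "-i", "hdfs://h1:9000/classifier/test-vectors",  // held-out tfidf vectors
            "-m", "hdfs://h1:9000/classifier/model",         // trained model dir
            "-l", "hdfs://h1:9000/classifier/indexlabel",    // label index file
            "-o", "hdfs://h1:9000/classifier/result",
            "-ow"
        });
    }
}
```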

The test results (clearly wrong):

I fought with this for many days and nights, but the error above is still unsolved. Frustrating. Any help from the experts would be appreciated o(╯□╰)o.
