Installing Spark on a single Linux machine requires the JDK and Scala to be installed first.

1. Create the installation directory

> mkdir  /opt/spark

> cd  /opt/spark
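
This assumes the Spark binary package has already been downloaded into /opt/spark. If not, it can be fetched from the Apache archive (URL shown for reference):

> wget https://archive.apache.org/dist/spark/spark-2.3.0/spark-2.3.0-bin-hadoop2.7.tgz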

2. Extract the archive and create a symlink

> tar  zxvf  spark-2.3.0-bin-hadoop2.7.tgz

> ln -s spark-2.3.0-bin-hadoop2.7 spark
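
To confirm the symlink resolves to the extracted directory:

> ls -ld /opt/spark/spark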

3. Edit /etc/profile

> vi /etc/profile

Append the following lines:

export SPARK_HOME=/opt/spark/spark
export PATH=$PATH:$SPARK_HOME/bin

> source  /etc/profile
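
If the profile change took effect, the spark-submit launcher is now on the PATH; a quick check (it should print the Spark 2.3.0 version banner):

> spark-submit --version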

4. Enter the configuration directory

> cd /opt/spark/spark/conf

5. Configure spark-env.sh

> cp spark-env.sh.template spark-env.sh

Add the following to spark-env.sh:

export SCALA_HOME=/opt/scala/scala
export JAVA_HOME=/opt/java/jdk
export SPARK_HOME=/opt/spark/spark
export SPARK_MASTER_IP=hserver1
export SPARK_EXECUTOR_MEMORY=1G

Note: adjust the paths above to match your own installation. (In Spark 2.x, SPARK_MASTER_IP is deprecated in favor of SPARK_MASTER_HOST; the old name still works but logs a warning.)

6. Configure slaves

> cp slaves.template  slaves

Add the following to slaves:

localhost
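
With slaves in place, the standalone master and worker daemons can be started with the bundled sbin scripts; jps should then list a Master and a Worker process. (This is optional for the local examples below; paths assume the layout above.)

> /opt/spark/spark/sbin/start-all.sh

> jps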

7. Run a Spark example

> cd /opt/spark/spark

> ./bin/run-example SparkPi 10

Output like the following should appear:

[aston@localhost spark]$ ./bin/run-example SparkPi 10
2018-06-04 22:37:25 WARN Utils:66 - Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.199.150 instead (on interface wlp8s0b1)
2018-06-04 22:37:25 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-06-04 22:37:25 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-06-04 22:37:25 INFO SparkContext:54 - Running Spark version 2.3.0
2018-06-04 22:37:25 INFO SparkContext:54 - Submitted application: Spark Pi
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing view acls to: aston
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing modify acls to: aston
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-04 22:37:26 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aston); groups with view permissions: Set(); users with modify permissions: Set(aston); groups with modify permissions: Set()
2018-06-04 22:37:26 INFO Utils:54 - Successfully started service 'sparkDriver' on port 34729.
2018-06-04 22:37:26 INFO SparkEnv:54 - Registering MapOutputTracker
2018-06-04 22:37:26 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-06-04 22:37:26 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-06-04 22:37:26 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-06-04 22:37:26 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-4d51d515-85db-4a8c-bb45-219fd96be3c6
2018-06-04 22:37:26 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-06-04 22:37:26 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-06-04 22:37:26 INFO log:192 - Logging initialized @2296ms
2018-06-04 22:37:26 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-06-04 22:37:26 INFO Server:414 - Started @2382ms
2018-06-04 22:37:26 INFO AbstractConnector:278 - Started ServerConnector@779dfe55{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-04 22:37:26 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f212d84{/jobs,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@27ead29e{/jobs/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4c060c8f{/jobs/job,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@383f3558{/jobs/job/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@49b07ee3{/stages,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@352e612e{/stages/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@65f00478{/stages/stage,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@28486680{/stages/stage/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4d7e7435{/stages/pool,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4a1e3ac1{/stages/pool/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6e78fcf5{/storage,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@56febdc{/storage/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3b8ee898{/storage/rdd,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7d151a{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@294bdeb4{/environment,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5300f14a{/environment/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1f86099a{/executors,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77bb0ab5{/executors/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@f2c488{/executors/threadDump,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54acff7d{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7bc9e6ab{/static,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37d00a23{/,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@433e536f{/api,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@988246e{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@62515a47{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://192.168.199.150:4040
2018-06-04 22:37:26 INFO SparkContext:54 - Added JAR file:///opt/spark/spark/examples/jars/spark-examples_2.11-2.3.0.jar at spark://192.168.199.150:34729/jars/spark-examples_2.11-2.3.0.jar with timestamp 1528123046779
2018-06-04 22:37:26 INFO SparkContext:54 - Added JAR file:///opt/spark/spark/examples/jars/scopt_2.11-3.7.0.jar at spark://192.168.199.150:34729/jars/scopt_2.11-3.7.0.jar with timestamp 1528123046780
2018-06-04 22:37:26 INFO Executor:54 - Starting executor ID driver on host localhost
2018-06-04 22:37:26 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45436.
2018-06-04 22:37:26 INFO NettyBlockTransferService:54 - Server created on 192.168.199.150:45436
2018-06-04 22:37:26 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-06-04 22:37:26 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:26 INFO BlockManagerMasterEndpoint:54 - Registering block manager 192.168.199.150:45436 with 366.3 MB RAM, BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:26 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:26 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:27 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@65bcf7c2{/metrics/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:27 INFO SparkContext:54 - Starting job: reduce at SparkPi.scala:38
2018-06-04 22:37:27 INFO DAGScheduler:54 - Got job 0 (reduce at SparkPi.scala:38) with 10 output partitions
2018-06-04 22:37:27 INFO DAGScheduler:54 - Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
2018-06-04 22:37:27 INFO DAGScheduler:54 - Parents of final stage: List()
2018-06-04 22:37:27 INFO DAGScheduler:54 - Missing parents: List()
2018-06-04 22:37:27 INFO DAGScheduler:54 - Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
2018-06-04 22:37:27 INFO MemoryStore:54 - Block broadcast_0 stored as values in memory (estimated size 1832.0 B, free 366.3 MB)
2018-06-04 22:37:28 INFO MemoryStore:54 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1181.0 B, free 366.3 MB)
2018-06-04 22:37:28 INFO BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on 192.168.199.150:45436 (size: 1181.0 B, free: 366.3 MB)
2018-06-04 22:37:28 INFO SparkContext:54 - Created broadcast 0 from broadcast at DAGScheduler.scala:1039
2018-06-04 22:37:28 INFO DAGScheduler:54 - Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2018-06-04 22:37:28 INFO TaskSchedulerImpl:54 - Adding task set 0.0 with 10 tasks
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 2.0 in stage 0.0 (TID 2)
2018-06-04 22:37:28 INFO Executor:54 - Running task 1.0 in stage 0.0 (TID 1)
2018-06-04 22:37:28 INFO Executor:54 - Running task 3.0 in stage 0.0 (TID 3)
2018-06-04 22:37:28 INFO Executor:54 - Running task 0.0 in stage 0.0 (TID 0)
2018-06-04 22:37:28 INFO Executor:54 - Fetching spark://192.168.199.150:34729/jars/scopt_2.11-3.7.0.jar with timestamp 1528123046780
2018-06-04 22:37:28 INFO TransportClientFactory:267 - Successfully created connection to /192.168.199.150:34729 after 34 ms (0 ms spent in bootstraps)
2018-06-04 22:37:28 INFO Utils:54 - Fetching spark://192.168.199.150:34729/jars/scopt_2.11-3.7.0.jar to /tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/fetchFileTemp8606784681518533462.tmp
2018-06-04 22:37:28 INFO Executor:54 - Adding file:/tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/scopt_2.11-3.7.0.jar to class loader
2018-06-04 22:37:28 INFO Executor:54 - Fetching spark://192.168.199.150:34729/jars/spark-examples_2.11-2.3.0.jar with timestamp 1528123046779
2018-06-04 22:37:28 INFO Utils:54 - Fetching spark://192.168.199.150:34729/jars/spark-examples_2.11-2.3.0.jar to /tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/fetchFileTemp8435156876449095794.tmp
2018-06-04 22:37:28 INFO Executor:54 - Adding file:/tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/spark-examples_2.11-2.3.0.jar to class loader
2018-06-04 22:37:28 INFO Executor:54 - Finished task 0.0 in stage 0.0 (TID 0). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO Executor:54 - Finished task 1.0 in stage 0.0 (TID 1). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 2.0 in stage 0.0 (TID 2). 867 bytes result sent to driver
2018-06-04 22:37:28 INFO Executor:54 - Running task 4.0 in stage 0.0 (TID 4)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 5.0 in stage 0.0 (TID 5)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 3.0 in stage 0.0 (TID 3). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 7.0 in stage 0.0 (TID 7)
2018-06-04 22:37:28 INFO Executor:54 - Running task 6.0 in stage 0.0 (TID 6)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 1.0 in stage 0.0 (TID 1) in 362 ms on localhost (executor driver) (1/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 3.0 in stage 0.0 (TID 3) in 385 ms on localhost (executor driver) (2/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 418 ms on localhost (executor driver) (3/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 2.0 in stage 0.0 (TID 2) in 388 ms on localhost (executor driver) (4/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 5.0 in stage 0.0 (TID 5). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 5.0 in stage 0.0 (TID 5) in 79 ms on localhost (executor driver) (5/10)
2018-06-04 22:37:28 INFO Executor:54 - Running task 8.0 in stage 0.0 (TID 8)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 4.0 in stage 0.0 (TID 4). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 9.0 in stage 0.0 (TID 9)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 4.0 in stage 0.0 (TID 4) in 99 ms on localhost (executor driver) (6/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 7.0 in stage 0.0 (TID 7). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 7.0 in stage 0.0 (TID 7) in 98 ms on localhost (executor driver) (7/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 6.0 in stage 0.0 (TID 6). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 6.0 in stage 0.0 (TID 6) in 107 ms on localhost (executor driver) (8/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 9.0 in stage 0.0 (TID 9). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO Executor:54 - Finished task 8.0 in stage 0.0 (TID 8). 867 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 9.0 in stage 0.0 (TID 9) in 39 ms on localhost (executor driver) (9/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 8.0 in stage 0.0 (TID 8) in 57 ms on localhost (executor driver) (10/10)
2018-06-04 22:37:28 INFO TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool
2018-06-04 22:37:28 INFO DAGScheduler:54 - ResultStage 0 (reduce at SparkPi.scala:38) finished in 0.800 s
2018-06-04 22:37:28 INFO DAGScheduler:54 - Job 0 finished: reduce at SparkPi.scala:38, took 0.945853 s
Pi is roughly 3.14023914023914
2018-06-04 22:37:28 INFO AbstractConnector:318 - Stopped Spark@779dfe55{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-04 22:37:28 INFO SparkUI:54 - Stopped Spark web UI at http://192.168.199.150:4040
2018-06-04 22:37:28 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-06-04 22:37:28 INFO MemoryStore:54 - MemoryStore cleared
2018-06-04 22:37:28 INFO BlockManager:54 - BlockManager stopped
2018-06-04 22:37:28 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-06-04 22:37:28 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-06-04 22:37:28 INFO SparkContext:54 - Successfully stopped SparkContext
2018-06-04 22:37:28 INFO ShutdownHookManager:54 - Shutdown hook called
2018-06-04 22:37:28 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e
2018-06-04 22:37:28 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-16300765-9872-4542-91ed-1a7a0f8285d9
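
The trailing argument (10) is the number of partitions the sampling job is split into. SparkPi estimates π by Monte Carlo sampling of the unit circle; a minimal Scala sketch of the same idea (illustrative, not the exact example source) looks like this:

import org.apache.spark.sql.SparkSession

object PiSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("Pi Sketch").getOrCreate()
    // The command-line argument controls the number of partitions (slices).
    val slices = if (args.nonEmpty) args(0).toInt else 2
    val n = 100000L * slices
    // Sample points in the unit square; count those inside the unit circle.
    val inside = spark.sparkContext.parallelize(1L to n, slices).map { _ =>
      val x = math.random * 2 - 1
      val y = math.random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    // The circle/square area ratio is pi/4, so pi ≈ 4 * inside / total.
    println(s"Pi is roughly ${4.0 * inside / n}")
    spark.stop()
  }
}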

8. Run the Spark shell

> cd /opt/spark/spark

> ./bin/spark-shell
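
Once the scala> prompt appears, any small computation on the pre-created SparkContext sc serves as a smoke test, for example:

scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050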
