Installing Spark on a Single Linux Machine
1. Create the directory
> mkdir /opt/spark
> cd /opt/spark
2. Extract the archive and create a symbolic link
> tar zxvf spark-2.3.0-bin-hadoop2.7.tgz
> ln -s spark-2.3.0-bin-hadoop2.7 spark
3. Edit /etc/profile
> vi /etc/profile
Add the following lines:
export SPARK_HOME=/opt/spark/spark
export PATH=$PATH:$SPARK_HOME/bin
> source /etc/profile
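To confirm the environment variables took effect, a quick sanity check can be run (a sketch, assuming the directory layout above):

```shell
# Verify that SPARK_HOME is set and the Spark binaries are on PATH
echo $SPARK_HOME          # should print /opt/spark/spark
which spark-submit        # should print /opt/spark/spark/bin/spark-submit
spark-submit --version    # prints the Spark version banner
```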
4. Enter the configuration directory
> cd /opt/spark/spark/conf
5. Configure spark-env.sh
> cp spark-env.sh.template spark-env.sh
Add the following to spark-env.sh:
export SCALA_HOME=/opt/scala/scala
export JAVA_HOME=/opt/java/jdk
export SPARK_HOME=/opt/spark/spark
export SPARK_MASTER_IP=hserver1
export SPARK_EXECUTOR_MEMORY=1G
Note: adjust the paths above (and the hostname in SPARK_MASTER_IP) to match your own environment.
6. Configure slaves
> cp slaves.template slaves
Add the following line to slaves:
localhost
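With spark-env.sh and slaves in place, the standalone master and worker daemons can be started with the scripts Spark ships under sbin/ (a sketch; paths follow the layout above):

```shell
cd /opt/spark/spark
# Start the standalone master plus all workers listed in conf/slaves
./sbin/start-all.sh
# Verify the daemons are running; Master and Worker processes should appear
jps
# The master web UI is served on port 8080 by default.
# Stop everything when done:
# ./sbin/stop-all.sh
```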
7. Run a Spark example
> cd /opt/spark/spark
> ./bin/run-example SparkPi 10
Output similar to the following should appear:
[aston@localhost spark]$ ./bin/run-example SparkPi 10
2018-06-04 22:37:25 WARN Utils:66 - Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.199.150 instead (on interface wlp8s0b1)
2018-06-04 22:37:25 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-06-04 22:37:25 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-06-04 22:37:25 INFO SparkContext:54 - Running Spark version 2.3.0
2018-06-04 22:37:25 INFO SparkContext:54 - Submitted application: Spark Pi
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing view acls to: aston
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing modify acls to: aston
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-04 22:37:26 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-04 22:37:26 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aston); groups with view permissions: Set(); users with modify permissions: Set(aston); groups with modify permissions: Set()
2018-06-04 22:37:26 INFO Utils:54 - Successfully started service 'sparkDriver' on port 34729.
2018-06-04 22:37:26 INFO SparkEnv:54 - Registering MapOutputTracker
2018-06-04 22:37:26 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-06-04 22:37:26 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-06-04 22:37:26 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-06-04 22:37:26 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-4d51d515-85db-4a8c-bb45-219fd96be3c6
2018-06-04 22:37:26 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-06-04 22:37:26 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-06-04 22:37:26 INFO log:192 - Logging initialized @2296ms
2018-06-04 22:37:26 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-06-04 22:37:26 INFO Server:414 - Started @2382ms
2018-06-04 22:37:26 INFO AbstractConnector:278 - Started ServerConnector@779dfe55{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-04 22:37:26 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f212d84{/jobs,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@27ead29e{/jobs/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4c060c8f{/jobs/job,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@383f3558{/jobs/job/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@49b07ee3{/stages,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@352e612e{/stages/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@65f00478{/stages/stage,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@28486680{/stages/stage/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4d7e7435{/stages/pool,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4a1e3ac1{/stages/pool/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6e78fcf5{/storage,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@56febdc{/storage/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3b8ee898{/storage/rdd,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7d151a{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@294bdeb4{/environment,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5300f14a{/environment/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1f86099a{/executors,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77bb0ab5{/executors/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@f2c488{/executors/threadDump,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54acff7d{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7bc9e6ab{/static,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37d00a23{/,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@433e536f{/api,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@988246e{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@62515a47{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-06-04 22:37:26 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://192.168.199.150:4040
2018-06-04 22:37:26 INFO SparkContext:54 - Added JAR file:///opt/spark/spark/examples/jars/spark-examples_2.11-2.3.0.jar at spark://192.168.199.150:34729/jars/spark-examples_2.11-2.3.0.jar with timestamp 1528123046779
2018-06-04 22:37:26 INFO SparkContext:54 - Added JAR file:///opt/spark/spark/examples/jars/scopt_2.11-3.7.0.jar at spark://192.168.199.150:34729/jars/scopt_2.11-3.7.0.jar with timestamp 1528123046780
2018-06-04 22:37:26 INFO Executor:54 - Starting executor ID driver on host localhost
2018-06-04 22:37:26 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45436.
2018-06-04 22:37:26 INFO NettyBlockTransferService:54 - Server created on 192.168.199.150:45436
2018-06-04 22:37:26 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-06-04 22:37:26 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:26 INFO BlockManagerMasterEndpoint:54 - Registering block manager 192.168.199.150:45436 with 366.3 MB RAM, BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:26 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:26 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 192.168.199.150, 45436, None)
2018-06-04 22:37:27 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@65bcf7c2{/metrics/json,null,AVAILABLE,@Spark}
2018-06-04 22:37:27 INFO SparkContext:54 - Starting job: reduce at SparkPi.scala:38
2018-06-04 22:37:27 INFO DAGScheduler:54 - Got job 0 (reduce at SparkPi.scala:38) with 10 output partitions
2018-06-04 22:37:27 INFO DAGScheduler:54 - Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
2018-06-04 22:37:27 INFO DAGScheduler:54 - Parents of final stage: List()
2018-06-04 22:37:27 INFO DAGScheduler:54 - Missing parents: List()
2018-06-04 22:37:27 INFO DAGScheduler:54 - Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
2018-06-04 22:37:27 INFO MemoryStore:54 - Block broadcast_0 stored as values in memory (estimated size 1832.0 B, free 366.3 MB)
2018-06-04 22:37:28 INFO MemoryStore:54 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1181.0 B, free 366.3 MB)
2018-06-04 22:37:28 INFO BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on 192.168.199.150:45436 (size: 1181.0 B, free: 366.3 MB)
2018-06-04 22:37:28 INFO SparkContext:54 - Created broadcast 0 from broadcast at DAGScheduler.scala:1039
2018-06-04 22:37:28 INFO DAGScheduler:54 - Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2018-06-04 22:37:28 INFO TaskSchedulerImpl:54 - Adding task set 0.0 with 10 tasks
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 2.0 in stage 0.0 (TID 2)
2018-06-04 22:37:28 INFO Executor:54 - Running task 1.0 in stage 0.0 (TID 1)
2018-06-04 22:37:28 INFO Executor:54 - Running task 3.0 in stage 0.0 (TID 3)
2018-06-04 22:37:28 INFO Executor:54 - Running task 0.0 in stage 0.0 (TID 0)
2018-06-04 22:37:28 INFO Executor:54 - Fetching spark://192.168.199.150:34729/jars/scopt_2.11-3.7.0.jar with timestamp 1528123046780
2018-06-04 22:37:28 INFO TransportClientFactory:267 - Successfully created connection to /192.168.199.150:34729 after 34 ms (0 ms spent in bootstraps)
2018-06-04 22:37:28 INFO Utils:54 - Fetching spark://192.168.199.150:34729/jars/scopt_2.11-3.7.0.jar to /tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/fetchFileTemp8606784681518533462.tmp
2018-06-04 22:37:28 INFO Executor:54 - Adding file:/tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/scopt_2.11-3.7.0.jar to class loader
2018-06-04 22:37:28 INFO Executor:54 - Fetching spark://192.168.199.150:34729/jars/spark-examples_2.11-2.3.0.jar with timestamp 1528123046779
2018-06-04 22:37:28 INFO Utils:54 - Fetching spark://192.168.199.150:34729/jars/spark-examples_2.11-2.3.0.jar to /tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/fetchFileTemp8435156876449095794.tmp
2018-06-04 22:37:28 INFO Executor:54 - Adding file:/tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e/userFiles-36ae13de-60e8-42fd-958d-66c3c3832d4a/spark-examples_2.11-2.3.0.jar to class loader
2018-06-04 22:37:28 INFO Executor:54 - Finished task 0.0 in stage 0.0 (TID 0). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO Executor:54 - Finished task 1.0 in stage 0.0 (TID 1). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 2.0 in stage 0.0 (TID 2). 867 bytes result sent to driver
2018-06-04 22:37:28 INFO Executor:54 - Running task 4.0 in stage 0.0 (TID 4)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 5.0 in stage 0.0 (TID 5)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 3.0 in stage 0.0 (TID 3). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 7.0 in stage 0.0 (TID 7)
2018-06-04 22:37:28 INFO Executor:54 - Running task 6.0 in stage 0.0 (TID 6)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 1.0 in stage 0.0 (TID 1) in 362 ms on localhost (executor driver) (1/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 3.0 in stage 0.0 (TID 3) in 385 ms on localhost (executor driver) (2/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 418 ms on localhost (executor driver) (3/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 2.0 in stage 0.0 (TID 2) in 388 ms on localhost (executor driver) (4/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 5.0 in stage 0.0 (TID 5). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 5.0 in stage 0.0 (TID 5) in 79 ms on localhost (executor driver) (5/10)
2018-06-04 22:37:28 INFO Executor:54 - Running task 8.0 in stage 0.0 (TID 8)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 4.0 in stage 0.0 (TID 4). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7853 bytes)
2018-06-04 22:37:28 INFO Executor:54 - Running task 9.0 in stage 0.0 (TID 9)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 4.0 in stage 0.0 (TID 4) in 99 ms on localhost (executor driver) (6/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 7.0 in stage 0.0 (TID 7). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 7.0 in stage 0.0 (TID 7) in 98 ms on localhost (executor driver) (7/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 6.0 in stage 0.0 (TID 6). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 6.0 in stage 0.0 (TID 6) in 107 ms on localhost (executor driver) (8/10)
2018-06-04 22:37:28 INFO Executor:54 - Finished task 9.0 in stage 0.0 (TID 9). 824 bytes result sent to driver
2018-06-04 22:37:28 INFO Executor:54 - Finished task 8.0 in stage 0.0 (TID 8). 867 bytes result sent to driver
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 9.0 in stage 0.0 (TID 9) in 39 ms on localhost (executor driver) (9/10)
2018-06-04 22:37:28 INFO TaskSetManager:54 - Finished task 8.0 in stage 0.0 (TID 8) in 57 ms on localhost (executor driver) (10/10)
2018-06-04 22:37:28 INFO TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool
2018-06-04 22:37:28 INFO DAGScheduler:54 - ResultStage 0 (reduce at SparkPi.scala:38) finished in 0.800 s
2018-06-04 22:37:28 INFO DAGScheduler:54 - Job 0 finished: reduce at SparkPi.scala:38, took 0.945853 s
Pi is roughly 3.14023914023914
2018-06-04 22:37:28 INFO AbstractConnector:318 - Stopped Spark@779dfe55{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-04 22:37:28 INFO SparkUI:54 - Stopped Spark web UI at http://192.168.199.150:4040
2018-06-04 22:37:28 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-06-04 22:37:28 INFO MemoryStore:54 - MemoryStore cleared
2018-06-04 22:37:28 INFO BlockManager:54 - BlockManager stopped
2018-06-04 22:37:28 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-06-04 22:37:28 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-06-04 22:37:28 INFO SparkContext:54 - Successfully stopped SparkContext
2018-06-04 22:37:28 INFO ShutdownHookManager:54 - Shutdown hook called
2018-06-04 22:37:28 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-a840c54e-7db9-4dfc-a446-1fa10a8d2c3e
2018-06-04 22:37:28 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-16300765-9872-4542-91ed-1a7a0f8285d9
8. Run the Spark shell
> cd /opt/spark/spark
> ./bin/spark-shell
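Inside the shell, a small job can confirm that the local Spark instance works. A one-liner can also be piped in non-interactively (a sketch, assuming Spark is on PATH as configured above; the REPL exits when stdin ends):

```shell
# Sum the integers 1..100 on a local 2-core Spark instance
echo 'println(sc.parallelize(1 to 100).sum())' | spark-shell --master local[2]
```

The job output should include 5050.0 among the log lines.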