Spark-shell Startup Script Walkthrough
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Shell script for starting the Spark Shell REPL

# Check whether we are running under Cygwin
cygwin=false
case "`uname`" in
CYGWIN*) cygwin=true;;
esac

# Enter posix mode for bash
set -o posix

## Global script variables

# Resolve FWDIR to the Spark installation directory
FWDIR="$(cd `dirname $0`/..; pwd)" #定义帮助信息的方法
# usage() reuses spark-submit's help output, only filtering out the spark-submit Usage lines below:
#   Usage: spark-submit [options] <app jar | python file> [app arguments]
#   Usage: spark-submit --kill [submission ID] --master [spark://...]
#   Usage: spark-submit --status [submission ID] --master [spark://...]
function usage() {
echo "Usage: ./bin/spark-shell [options]"
$FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
exit 0
} if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
usage
fi

# Source utils.sh. It tidies up the command-line arguments, validates some of them,
# and assigns values to the two variables below.
# SUBMISSION_OPTS includes:
# key-value options: --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
#   --conf | --properties-file | --driver-memory | --driver-java-options | \
#   --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
#   --total-executor-cores | --executor-cores | --queue | --num-executors | --archives
# flag options (no value):
#   --verbose | -v | --supervise
# For key-value options the remaining argument count is checked, so every option has a value.
#
# APPLICATION_OPTS collects every argument not captured in SUBMISSION_OPTS.
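# A hypothetical example of the split (added for this walkthrough, not part of the original script;
# the option values and the trailing application argument are made up):
#   ./bin/spark-shell --master local[2] --name demo extraArg
#   => SUBMISSION_OPTS=(--master local[2] --name demo)
#   => APPLICATION_OPTS=(extraArg)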
source $FWDIR/bin/utils.sh
# Variable holding the name of the usage function
SUBMIT_USAGE_FUNCTION=usage
# Call gatherSparkSubmitOpts from utils.sh to sort out the arguments
gatherSparkSubmitOpts "$@" #主函数,调用spark-submit --class org.apache.spark.repl.Main方法 function main() {
if $cygwin; then
# Workaround for issue involving JLine and Cygwin
# (see http://sourceforge.net/p/jline/bugs/40/).
# If you're using the Mintty terminal emulator in Cygwin, may need to set the
# "Backspace sends ^H" setting in "Keys" section of the Mintty options
# (see https://github.com/sbt/sbt/issues/562).
stty -icanon min 1 -echo > /dev/null 2>&1
export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
stty icanon echo > /dev/null 2>&1
else
export SPARK_SUBMIT_OPTS
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
fi
}

# Copy restore-TTY-on-exit functions from Scala script so spark-shell exits properly even in
# binary distribution of Spark where Scala is not installed
exit_status=127
saved_stty="" # restore stty settings (echo in particular)
function restoreSttySettings() {
stty $saved_stty
saved_stty=""
}

function onExit() {
if [[ "$saved_stty" != "" ]]; then
restoreSttySettings
fi
exit $exit_status
}

# to reenable echo if we are interrupted before completing.
trap onExit INT

# save terminal settings
saved_stty=$(stty -g 2>/dev/null)
# clear on error so we don't later try to restore them
if [[ ! $? ]]; then
saved_stty=""
fi main "$@" # record the exit status lest it be overwritten:
# then reenable echo and propagate the code.
exit_status=$?
onExit
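To see what actually gets launched, consider a hypothetical invocation (the --master value is made up for illustration and no application arguments are passed):

./bin/spark-shell --master local[2]
# gatherSparkSubmitOpts puts --master local[2] into SUBMISSION_OPTS and leaves APPLICATION_OPTS empty,
# so main() effectively runs:
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main --master local[2] spark-shell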
Contents of the utils.sh script:
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Gather all spark-submit options into SUBMISSION_OPTS
function gatherSparkSubmitOpts() {

if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
echo "Function for printing usage of $0 is not set." >&
echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" >&
exit
fi

# NOTE: If you add or remove spark-submit options,
# modify NOT ONLY this script but also SparkSubmitArguments.scala
SUBMISSION_OPTS=()
APPLICATION_OPTS=()
while (($#)); do
case "$1" in
--master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
--conf | --properties-file | --driver-memory | --driver-java-options | \
--driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
--total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
if [[ $# -lt 2 ]]; then
"$SUBMIT_USAGE_FUNCTION"
exit 1;
fi
SUBMISSION_OPTS+=("$1"); shift
SUBMISSION_OPTS+=("$1"); shift
;;
--verbose | -v | --supervise)
SUBMISSION_OPTS+=("$1"); shift
;;
*)
APPLICATION_OPTS+=("$1"); shift
;;
esac
done

export SUBMISSION_OPTS
export APPLICATION_OPTS
}
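As a quick check, gatherSparkSubmitOpts can be exercised on its own from a shell. A minimal sketch, assuming the current directory is a Spark 1.x installation that ships bin/utils.sh; the demo usage function, option values, and trailing application argument are invented for illustration:

function usage() { echo "Usage: demo [options]"; }   # any usage printer will do
SUBMIT_USAGE_FUNCTION=usage
source ./bin/utils.sh
gatherSparkSubmitOpts --master local[2] --name demo -v myAppArg
echo "${SUBMISSION_OPTS[@]}"   # --master local[2] --name demo -v
echo "${APPLICATION_OPTS[@]}"  # myAppArg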