A Walkthrough of the spark-submit Script
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# NOTE: Any changes in this file must be reflected in SparkSubmitDriverBootstrapper.scala!

# SPARK_HOME is the Spark installation directory (the parent of this script's directory)
export SPARK_HOME="$(cd `dirname $0`/..; pwd)"
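The line above turns the script's own location into an absolute SPARK_HOME: `dirname $0` is the directory containing the script, `/..` steps up one level, and `cd ...; pwd` normalizes the result to an absolute path. A minimal, self-contained demo of the same idiom (all paths here are hypothetical):

mkdir -p /tmp/demo/bin
cat > /tmp/demo/bin/where.sh <<'EOF'
#!/usr/bin/env bash
# Same idiom as spark-submit: resolve the parent of this script's directory
echo "HOME_DIR=$(cd `dirname $0`/..; pwd)"
EOF
chmod +x /tmp/demo/bin/where.sh
/tmp/demo/bin/where.sh    # prints HOME_DIR=/tmp/demo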
# Save the original arguments into the ORIG_ARGS array
ORIG_ARGS=("$@") #根据不同的参数项,把对应的参数值赋给对应的环境变量
while (($#)); do
  if [ "$1" = "--deploy-mode" ]; then
    SPARK_SUBMIT_DEPLOY_MODE=$2
  elif [ "$1" = "--properties-file" ]; then
    SPARK_SUBMIT_PROPERTIES_FILE=$2
  elif [ "$1" = "--driver-memory" ]; then
    export SPARK_SUBMIT_DRIVER_MEMORY=$2
  elif [ "$1" = "--driver-library-path" ]; then
    export SPARK_SUBMIT_LIBRARY_PATH=$2
  elif [ "$1" = "--driver-class-path" ]; then
    export SPARK_SUBMIT_CLASSPATH=$2
  elif [ "$1" = "--driver-java-options" ]; then
    export SPARK_SUBMIT_OPTS=$2
  fi
  shift
done
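The loop looks at `$1` on each pass, grabs the option's value from `$2` when the name matches, and then `shift`s the argument list one position to the left; `(($#))` becomes false once the argument count reaches zero. A standalone sketch of the same scanning pattern, with made-up arguments:

set -- --deploy-mode cluster --driver-memory 4g   # simulate "$@"
while (($#)); do
  if [ "$1" = "--driver-memory" ]; then
    DEMO_DRIVER_MEMORY=$2                         # the value follows the option name
  fi
  shift                                           # move to the next argument
done
echo "$DEMO_DRIVER_MEMORY"                        # prints 4g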
# Define default values; any user-supplied value parsed above takes precedence.
# ${VAR:-default} behaves like NVL: it expands to the default when VAR is unset or empty.
DEFAULT_PROPERTIES_FILE="$SPARK_HOME/conf/spark-defaults.conf"
export SPARK_SUBMIT_DEPLOY_MODE=${SPARK_SUBMIT_DEPLOY_MODE:-"client"}
export SPARK_SUBMIT_PROPERTIES_FILE=${SPARK_SUBMIT_PROPERTIES_FILE:-"$DEFAULT_PROPERTIES_FILE"}
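A quick illustration of the `${VAR:-default}` expansion used in the two exports above (the variable name is made up):

unset DEMO_MODE
echo "${DEMO_MODE:-client}"    # prints client  (unset -> default is used)
DEMO_MODE=cluster
echo "${DEMO_MODE:-client}"    # prints cluster (set -> default is ignored)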
# For client mode, the driver will be launched in the same JVM that launches
# SparkSubmit, so we may need to read the properties file for any extra class
# paths, library paths, java options and memory early on. Otherwise, it will
# be too late by the time the driver JVM has started.

# Look in spark-defaults.conf for spark.driver.extra* / spark.driver.memory settings
if [[ "$SPARK_SUBMIT_DEPLOY_MODE" == "client" && -f "$SPARK_SUBMIT_PROPERTIES_FILE" ]]; then
  # Parse the properties file only if the special configs exist
  contains_special_configs=$(
    grep -e "spark.driver.extra*\|spark.driver.memory" "$SPARK_SUBMIT_PROPERTIES_FILE" | \
    grep -v "^[[:space:]]*#"
  )
  if [ -n "$contains_special_configs" ]; then
    export SPARK_SUBMIT_BOOTSTRAP_DRIVER=1
  fi
fi
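The first grep keeps lines mentioning spark.driver.extra* or spark.driver.memory; the second `grep -v` drops lines that are commented out, so only active settings trigger the bootstrap flag. A hypothetical spark-defaults.conf that exercises both filters:

cat > /tmp/spark-defaults.conf <<'EOF'
# spark.driver.memory 2g      (commented out, removed by the second grep)
spark.driver.memory   4g
spark.executor.memory 8g
EOF
grep -e "spark.driver.extra*\|spark.driver.memory" /tmp/spark-defaults.conf | \
  grep -v "^[[:space:]]*#"
# prints only: spark.driver.memory   4g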
# Pass the original arguments through to spark-class.
# exec replaces the current shell process with the command that follows,
# so no extra shell process is left running.
exec $SPARK_HOME/bin/spark-class org.apache.spark.deploy.SparkSubmit "${ORIG_ARGS[@]}"
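Because `exec` replaces the current process rather than forking a child, spark-submit itself disappears and spark-class takes over its PID; any line placed after an `exec` would never run. A one-line demonstration:

bash -c 'echo before; exec echo replaced; echo never-printed'
# prints:
# before
# replaced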