Spark (2) - Developing Applications with Spark
Exploring the Spark shell
Spark comes bundled with a REPL shell, which is a wrapper around the Scala shell. Though the Spark shell looks like a command line for simple things, in reality a lot of complex queries can also be executed using it.
1. create the words directory
mkdir words
2. go into the words directory
cd words
3. create a sh.txt file
echo "to be or not to be" > sh.txt
4. start the Spark shell
spark-shell
5. load the words directory as an RDD (Resilient Distributed Dataset)
scala> val words = sc.textFile("hdfs://localhost:9000/user/hduser/words")
6. count the number of lines (result: 1)
scala> words.count
7. divide the line (or lines) into multiple words
scala> val wordsFlatMap = words.flatMap(_.split("\\W+"))
8. convert each word to (word, 1)
scala> val wordsMap = wordsFlatMap.map(w => (w, 1))
9. add the number of occurrences for each word
scala> val wordCount = wordsMap.reduceByKey((a, b) => (a + b))
10. sort the results
scala> val wordCountSorted = wordCount.sortByKey(true)
11. print the RDD
scala> wordCountSorted.collect.foreach(println)
12. do all of the above operations in one step
scala> sc.textFile("hdfs://localhost:9000/user/hduser/words").flatMap(_.split("\\W+")).map(w => (w, 1)).reduceByKey((a, b) => (a + b)).sortByKey(true).collect.foreach(println)
This gives us the following output:
(or,1)
(to,2)
(not,1)
(be,2)
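The same transformation chain can be sketched without a cluster using plain Scala collections. This is a local analogue only, not Spark API: `groupBy` plus a per-group sum stands in for `reduceByKey`, and the object name `LocalWordCount` is made up for this sketch.

```scala
// Local-collection sketch of the word-count pipeline above.
// Assumption: plain Scala only, no SparkContext; groupBy + sum
// stands in for Spark's reduceByKey.
object LocalWordCount {
  def main(args: Array[String]): Unit = {
    val lines = List("to be or not to be")
    val counts = lines
      .flatMap(_.split("\\W+"))                      // divide lines into words
      .map(w => (w, 1))                              // word -> (word, 1)
      .groupBy(_._1)                                 // group the pairs by word
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum the 1s per word
      .toList
      .sortBy(_._1)                                  // sort by key
    counts.foreach(println)
  }
}
```

Running it prints the same four pairs as the shell session, which makes it a handy way to reason about each step before moving to an RDD.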
Developing Spark applications in Eclipse with Maven
Maven has two primary features:
1. Convention over configuration
/src/main/scala
/src/main/java
/src/main/resources
/src/test/scala
/src/test/java
/src/test/resources
2. Declarative dependency management
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
</dependency>
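For a Spark project, the same mechanism declares the Spark core library. The following is a sketch; the artifact suffix and version (spark-core_2.10, 1.6.0) are assumptions for the Scala 2.10 era this tutorial targets and should match your installation:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
```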
Install Maven plugin for Eclipse:
1. Open Eclipse and navigate to Help | Install New Software
2. Click on the Work with drop-down menu
3. Select the <eclipse version> update site
4. Click on Collaboration tools
5. Check Maven Integration for Eclipse (m2e)
6. Click on Next and then click on Finish
Install the Scala plugin for Eclipse:
1. Open Eclipse and navigate to Help | Install New Software
2. Click on the Work with drop-down menu
3. Select the <eclipse version> update site
4. Type http://download.scala-ide.org/sdk/helium/e38/scala210/stable/site
5. Press Enter
6. Select Scala IDE for Eclipse
7. Click on Next and then click on Finish
8. Navigate to Window | Open Perspective | Scala
Developing Spark applications in Eclipse with SBT
Simple Build Tool (SBT) is a build tool made especially for Scala-based development. SBT follows Maven-based naming conventions and declarative dependency management.
SBT provides the following enhancements over Maven:
1. Dependencies are in the form of key-value pairs in the build.sbt file as opposed to pom.xml in Maven
2. It provides a shell that makes it very handy to perform build operations
3. For simple projects without dependencies, you do not even need the build.sbt file
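A minimal build.sbt for the word-count project might look like the following. This is a sketch; the Scala and Spark versions are assumptions and should be adjusted to your installation:

```
// build.sbt -- minimal sketch; versions are assumptions
name := "wordcount"

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
```

The `%%` operator appends the Scala binary version to the artifact name, yielding spark-core_2.10 here, which mirrors the explicit suffix you would write in a Maven pom.xml.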
In build.sbt, the first line is the project definition:
lazy val root = (project in file("."))
Each project has an immutable map of key-value pairs.
lazy val root = (project in file(".")).
  settings(
    name := "wordcount"
  )
Every change in the settings leads to a new map, as it's an immutable map.
1. add the sbteclipse plugin to the global plugin file
mkdir -p /home/hduser/.sbt/0.13/plugins
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > /home/hduser/.sbt/0.13/plugins/plugin.sbt
or add it to a specific project
cd <project-home>
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > plugin.sbt
2. start the sbt shell
sbt
3. type eclipse and it will make an Eclipse-ready project
eclipse
4. navigate to File | Import | Import existing project into workspace to load the project into Eclipse