Exploring the Spark shell

Spark comes bundled with a REPL shell, which is a wrapper around the Scala shell. Though the Spark shell looks like a command line for simple things, in reality many complex queries can also be executed using it.

1. create the words directory

mkdir words

2. go into the words directory

cd words

3. create a sh.txt file

echo "to be or not to be" > sh.txt

4. start the Spark shell

spark-shell

5. load the words directory as an RDD (Resilient Distributed Dataset)

scala> val words = sc.textFile("hdfs://localhost:9000/user/hduser/words")

6. count the number of lines (result: 1)

scala> words.count

7. divide the line (or lines) into multiple words

scala> val wordsFlatMap = words.flatMap(_.split("\\W+"))

8. convert each word to (word, 1)

scala> val wordsMap = wordsFlatMap.map(w => (w, 1))

9. add the number of occurrences for each word

scala> val wordCount = wordsMap.reduceByKey((a, b) => (a + b))

10. sort the results

scala> val wordCountSorted = wordCount.sortByKey(true)

11. print the RDD

scala> wordCountSorted.collect.foreach(println)

12. do all of the preceding operations in one step

scala> sc.textFile("hdfs://localhost:9000/user/hduser/words").flatMap(_.split("\\W+")).map(w => (w,1)).reduceByKey((a,b) => (a+b)).sortByKey(true).collect.foreach(println)

This gives us the following output:
(be,2)
(not,1)
(or,1)
(to,2)
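The transformations above can be sketched with plain Scala collections, which share the flatMap/map semantics of RDDs; in this sketch, groupBy plus a sum stands in for reduceByKey, and sortBy stands in for sortByKey(true). It runs in any Scala REPL, no Spark cluster required:

```scala
// Plain-Scala sketch of the RDD word-count pipeline (no Spark needed)
def wordCount(lines: Seq[String]): List[(String, Int)] =
  lines
    .flatMap(_.split("\\W+"))                       // split lines into words
    .map(w => (w, 1))                               // pair each word with 1
    .groupBy(_._1)                                  // group the pairs by word
    .map { case (w, ones) => (w, ones.map(_._2).sum) } // sum the 1s per word (reduceByKey)
    .toList
    .sortBy(_._1)                                   // ascending by key (sortByKey(true))

wordCount(Seq("to be or not to be")).foreach(println)
// prints:
// (be,2)
// (not,1)
// (or,1)
// (to,2)
```

The difference in Spark is that each step runs lazily and in parallel across partitions, but the data flow is the same.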

Developing Spark applications in Eclipse with Maven

Maven has two primary features:

1. Convention over configuration

/src/main/scala
/src/main/java
/src/main/resources
/src/test/scala
/src/test/java
/src/test/resources

2. Declarative dependency management

<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
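For a Spark project, the same declarative mechanism pulls in Spark itself. A minimal dependency block might look like the following (the artifact suffix and version shown are assumptions; match them to your Scala version and cluster):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>

Maven then resolves Spark and all of its transitive dependencies from the central repository without any manual jar management.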

Install Maven plugin for Eclipse:

1. Open Eclipse and navigate to Help | Install New Software

2. Click on the Work with drop-down menu

3. Select the <eclipse version> update site

4. Click on Collaboration tools

5. Check Maven's integration with Eclipse

6. Click on Next and then click on Finish

Install the Scala plugin for Eclipse:

1. Open Eclipse and navigate to Help | Install New Software

2. Click on the Work with drop-down menu

3. Select the <eclipse version> update site

4. Type http://download.scala-ide.org/sdk/helium/e38/scala210/stable/site

5. Press Enter

6. Select Scala IDE for Eclipse

7. Click on Next and then click on Finish

8. Navigate to Window | Open Perspective | Scala

Developing Spark applications in Eclipse with SBT

Simple Build Tool (SBT) is a build tool made especially for Scala-based development. SBT follows Maven-based naming conventions and declarative dependency management.

SBT provides the following enhancements over Maven:

1. Dependencies are in the form of key-value pairs in the build.sbt file as opposed to pom.xml in Maven

2. It provides a shell that makes it very handy to perform build operations

3. For simple projects without dependencies, you do not even need the build.sbt file

In build.sbt, the first line is the project definition:

lazy val root = (project in file("."))

Each project has an immutable map of key-value pairs.

lazy val root = (project in file("."))
  .settings(
    name := "wordcount"
  )

Every change in the settings creates a new map, since the map is immutable.
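Because each call to settings returns a new project value with an updated map, further settings are simply chained onto the same definition; for example (the values here are illustrative):

lazy val root = (project in file("."))
  .settings(
    name := "wordcount",
    version := "1.0"   // added to the map; the previous map itself is unchanged
  )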

1. add to the global plugin file

mkdir -p /home/hduser/.sbt/0.13/plugins
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > /home/hduser/.sbt/0.13/plugins/plugin.sbt

or add to specific project

cd <project-home>
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > project/plugin.sbt

2. start the sbt shell

sbt

3. type eclipse and it will make an Eclipse-ready project

eclipse

4. navigate to File | Import | General | Existing Projects into Workspace to load the project into Eclipse
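Putting the pieces together, a minimal build.sbt for the word-count project might look like the following (the Scala and Spark versions are assumptions; match them to your installation, and note the "provided" scope keeps Spark out of the packaged jar since the cluster supplies it):

lazy val root = (project in file("."))
  .settings(
    name := "wordcount",
    version := "1.0",
    scalaVersion := "2.10.6",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
  )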
