The Hadoop on Azure Sqoop Import Sample Tutorial

Overview

This tutorial shows how to use Sqoop to import data from a SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.

While Hadoop is a natural choice for processing unstructured and
semi-structured data, such as logs and files, there may also be a need
to process structured data stored in relational databases.

Sqoop is a tool designed to transfer data between Hadoop and
relational databases. You can use it to import data from a relational
database management system (RDBMS) such as SQL Server, MySQL, or Oracle into
the Hadoop Distributed File System (HDFS), transform
the data in Hadoop with MapReduce or Hive, and then export the data
back into an RDBMS. In this tutorial, you use a SQL Database for
your relational database.
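
These two directions correspond to the sqoop import and sqoop export commands. The lines below are a minimal sketch of their general form, not commands from this tutorial; the JDBC connection string, table name, and HDFS directories are placeholders that you replace with your own values.

sqoop import --connect "<jdbc-connection-string>" --table <table-name> --target-dir <hdfs-directory> -m 1
sqoop export --connect "<jdbc-connection-string>" --table <table-name> --export-dir <hdfs-directory>

The -m 1 option tells Sqoop to run a single map task; with larger tables you would typically raise this value and let Sqoop split the work across mappers.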

Sqoop is an open source software product of Cloudera, Inc. Software development for Sqoop has recently moved from GitHub to the Apache Sqoop site.

In Hadoop on Azure, Sqoop is deployed from the Hadoop Command Shell on
the head node of the HDFS cluster. You use the Remote Desktop feature
available in the Hadoop on Azure portal to access the head node of the
cluster for this deployment.

Goals

In this tutorial, you learn three things:

  1. How to set up a SQL database on Windows Azure for use with the tutorial.

  2. How to use the Remote Desktop feature in Hadoop on Azure to access the head node of the HDFS cluster.

  3. How to import relational data from SQL Database to a Hadoop on Azure HDFS cluster by using Sqoop.


Setup and Configuration

You must have an account to access Hadoop on Azure and have created a
cluster to work through this tutorial. To obtain an account and create
a Hadoop cluster, follow the instructions outlined in the
Getting started with Microsoft Hadoop on Azure section of the Introduction to Hadoop on Azure topic.

You will also need the outward-facing IP address of your current
location when configuring the firewall on SQL Database. To obtain it,
go to the site WhatIsMyIP and make a note of it. Later in the
procedure, you will also need the outward-facing IP address of the
head node of the Hadoop cluster, which you can obtain in the same way.


Tutorial

This tutorial is composed of the following segments:

  1. How to set up a SQL database.

  2. How to use Sqoop from Hadoop on Azure to import data to the HDFS cluster.

How to set up a SQL database

Log in to your Windows Azure account. To create a database server, click the
Database icon in the lower left-hand corner of the page.

On the Getting Started page, click the Create a new SQL Database Server option.

Select the type of subscription (such as Pay-As-You-Go) associated with your account in the
Create Server window and press Next.

Select the appropriate Region in the Create Server window and click
Next.

Specify the login and password of the server-level principal of your SQL Database server and then press
Next.

Press Add to specify a firewall rule that allows
your current location access to SQL Database so that you can upload the
AdventureWorks database. The firewall grants access based on the
originating IP address of each request. Use the IP address you found
during the configuration preliminaries of this tutorial for the values
to add. Specify a Rule name and your own IP address, not one used for
illustration purposes. (You must also add the outward-facing IP address
of the head node in your Hadoop cluster. If you know it already, add it
now.) Then press the Finish button.
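
If you prefer T-SQL to the portal, SQL Database also exposes the sp_set_firewall_rule stored procedure. The query below is a sketch only: it assumes you are connected to the master database of your SQL Database server, and the rule name and IP address are example values that you must replace with your own.

-- Run against the master database of your SQL Database server.
-- Replace the rule name and addresses with your own outward-facing IP.
EXEC sp_set_firewall_rule N'MyLocation', '203.0.113.10', '203.0.113.10'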

Download the AdventureWorks2012 database onto your local machine from
the Recommended Downloads link on the Adventure Works for SQL Database
site.

Unzip the file, open an Administrator Command Prompt, and navigate to
the AdventureWorks directory inside the AdventureWorks2012ForSQLAzure
folder.

Run CreateAdventureWorksForSQLAzure.cmd by typing the following:

CreateAdventureWorksForSQLAzure.cmd servername username password

For example, if the assigned SQL Database server is named b1gl33p,
the administrator user name is "Fred", and the password is "Secret", you would
type the following:

CreateAdventureWorksForSQLAzure.cmd b1gl33p.database.windows.net Fred@b1gl33p Secret

The script creates the database, installs the schema, and populates the database with sample data.
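
If you want to confirm that the load succeeded before returning to the portal, one simple check (shown here as a sketch; run it against the master database of your server with the same credentials) is to list the databases on the server:

SELECT name FROM sys.databases WHERE name = 'AdventureWorks2012'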

Return to the Windows Azure Platform portal page, click your subscription on the left-hand side (Pay-As-You-Go in this example) and select your database (here named wq6xlbyoq0). The AdventureWorks2012 database should be listed
in the Database Name column. Select it and press the Manage icon at the top of the page.

Enter the credentials for the SQL database when prompted and press Log on.

This opens the Web interface for the Adventure Works database on SQL Database. Press the
New Query icon at the top to open the query editor.

Sqoop currently wraps the table name in square brackets, so the
two-part name Sales.SalesOrderDetail is treated as the single
identifier [Sales.SalesOrderDetail]. To make that name resolve, add a
synonym that supports two-part naming for SQL Server tables by running
the following query:

CREATE SYNONYM [Sales.SalesOrderDetail] FOR Sales.SalesOrderDetail

Run the following query and review its result.

select top 200 * from [Sales.SalesOrderDetail]

How to use Sqoop from Hadoop on Azure to import SQL Database query results to the HDFS cluster

From your Account page, scroll down to the Open Ports icon in the
Your cluster section and click the icon to open the ODBC Server port on the head node in your cluster.

Return to your Account page, scroll down to the Your cluster section, and this time click the
Remote Desktop icon to connect to the head node of your cluster.

Select Open when prompted to open the .rdp file.

Select Connect in the Remote Desktop Connection window.

Enter your credentials for the Hadoop cluster (not your Hadoop on Azure account) into the
Windows Security window and select OK.

Open Internet Explorer and go to the site WhatIsMyIP
to obtain the outward-facing IP address of the head node of the
cluster. Return to the SQL Database management page and add a firewall rule
that allows your Hadoop cluster access to SQL Database. The firewall
grants access based on the originating IP address of each request.

Double-click the Hadoop Command Shell icon in the upper left-hand corner of the desktop to open it. Navigate to the
"c:\Apps\dist\sqoop\bin" directory and run the following command:

sqoop import --connect "jdbc:sqlserver://[serverName].database.windows.net;username=[userName]@[serverName];password=[password];database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1

So, for example, for the following values:
* server name: wq6xlbyoq0
* username: HadoopOnAzureSqoopAdmin
* password: Pa$$w0rd

The sqoop command is:

sqoop import --connect "jdbc:sqlserver://wq6xlbyoq0.database.windows.net;username=HadoopOnAzureSqoopAdmin@wq6xlbyoq0;password=Pa$$w0rd;database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1
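
If you only want part of the table, the sqoop import command also accepts --columns and --where options. The command below is an illustrative sketch that reuses the placeholder connection string from above; the column list, filter, and target directory are example values, not ones used elsewhere in this tutorial.

sqoop import --connect "jdbc:sqlserver://[serverName].database.windows.net;username=[userName]@[serverName];password=[password];database=AdventureWorks2012" --table Sales.SalesOrderDetail --columns "SalesOrderID,OrderQty,UnitPrice" --where "OrderQty > 5" --target-dir /data/lineitemSubset -m 1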

Return to the Account page of the Hadoop on Azure portal and this time open the
Interactive Console. Run the #lsr command from the JavaScript console to list the files and directories on your HDFS cluster.
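
For example, to list the contents of the directory you imported into, you could run the following (the path argument is an assumption based on the #tail usage below; adjust it to match the --target-dir you used):

#lsr /data/lineitemData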

Run the #tail command to view selected results from the part-m-00000 file that Sqoop created (the path reflects the target directory used when the data was imported).

#tail /user/RAdmin/data/SalesOrderDetail/part-m-00000


Summary

In this tutorial, you have seen how to use Sqoop to import data from a
SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.
