Windows HBase Installation
In the previous post, I introduced how to install Hadoop on a Windows-based system. Now I will introduce how to install HBase on Windows.
1. Preparation:
Before the installation, let's take a look at the Hadoop-HBase support matrix below:

You can choose the appropriate version of HBase that is supported by your Hadoop system. Because I installed Hadoop 2.7.1 in the previous post, I choose HBase 1.3.1, which is supported by Hadoop 2.7.1.
2. Download the HBase 1.3.1 tarball from the official Apache site.
3. Extract the tarball to a folder on your local computer.
On my computer, I extracted it to C:\UserDefined\BigData\hbase-1.3.1, which is the HBase root directory (%HBASE_HOME%).
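The commands later in this post refer to %HBASE_HOME%. If you want that variable to resolve in a command prompt, you can set it yourself; a minimal sketch, assuming the extraction path above (adjust it to your own location):

rem set HBASE_HOME for the current command prompt session
set HBASE_HOME=C:\UserDefined\BigData\hbase-1.3.1
rem or persist it for future sessions (takes effect only in newly opened prompts)
setx HBASE_HOME "C:\UserDefined\BigData\hbase-1.3.1"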

4. Configuration
The configuration mainly involves two files located in the %HBASE_HOME%\conf folder: hbase-site.xml and hbase-env.cmd.
4.1) hbase-site.xml

<configuration>
  <property>
    <!-- HBase master address -->
    <name>hbase.master</name>
    <value>localhost:6000</value>
  </property>
  <property>
    <name>hbase.master.maxclockskew</name>
    <value>180000</value>
  </property>
  <property>
    <!-- HBase root folder in HDFS -->
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>false</value>
  </property>
  <property>
    <!-- ZooKeeper quorum address -->
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
  <property>
    <!-- HBase data folder in ZooKeeper -->
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/hbase</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
4.2) hbase-env.cmd
Set the JAVA_HOME path.
Note:
You can use the full path of the JDK installation root directory, or, if you have already configured the JAVA_HOME environment variable, you can reference that system variable instead. Because I have already configured JAVA_HOME, I reference the system variable directly.
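For example, the relevant line in hbase-env.cmd could look like one of the following; the JDK path in the second option is only an illustrative placeholder, so substitute your own installation directory:

rem option 1: reuse the system-wide JAVA_HOME variable
set JAVA_HOME=%JAVA_HOME%
rem option 2: point at the JDK root directly (example path, adjust to your machine)
rem set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_144

If the JDK path contains spaces, the HBase scripts may have trouble with it, so a path without spaces (or the 8.3 short form such as C:\Progra~1\...) can be safer.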


5. Operation
5.1) Stop Hadoop if it is already running, using the following command:
%HADOOP_HOME%\sbin\stop-all.cmd
Note:
%HADOOP_HOME% refers to the root directory of your Hadoop installation.

5.2) Start Hadoop using the following command:
%HADOOP_HOME%\sbin\start-all.cmd
Note:
%HADOOP_HOME% refers to the root directory of your Hadoop installation.
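If the JDK's jps tool is on your PATH, you can confirm that the Hadoop daemons came up before starting HBase; the exact process list can vary with your Hadoop configuration:

jps
rem expected (roughly): NameNode, DataNode, ResourceManager, NodeManager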

5.3) Start HBase using the following command:
%HBASE_HOME%\bin\start-hbase.cmd
Note:
%HBASE_HOME% refers to the root directory of your HBase installation.
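To confirm the master started, you can run jps again (an HMaster process should appear; with hbase.cluster.distributed set to false, the region server and ZooKeeper run inside that same JVM) and, assuming the default info port of HBase 1.x, open the master web UI:

jps
rem an HMaster process should now appear alongside the Hadoop daemons
start http://localhost:16010
rem opens the master web UI in the default browser (adjust the port if you changed hbase.master.info.port)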


5.4) Start the HBase REST service:
%HBASE_HOME%\bin\hbase rest start -p 6000
Note:
%HBASE_HOME% refers to the root directory of your HBase installation.
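Once the REST server is running, you can hit one of its standard endpoints to make sure it responds. A quick check, assuming curl is available on your system and the server is listening on port 6000 as started above:

curl http://localhost:6000/version
curl http://localhost:6000/status/cluster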

5.5) Start the HBase shell:
%HBASE_HOME%\bin\hbase shell
Note:
%HBASE_HOME% refers to the root directory of your HBase installation.
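Inside the shell, a few basic commands can verify that HBase can actually create and serve tables; the table and column family names below are just examples:

status
create 'test', 'cf'
put 'test', 'row1', 'cf:a', 'value1'
scan 'test'
disable 'test'
drop 'test'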
