reference: http://dblab.xmu.edu.cn/blog/install-hbase/

reference: http://dblab.xmu.edu.cn/blog/2139-2/

sudo wget http://archive.apache.org/dist/hbase/1.1.5/hbase-1.1.5-bin.tar.gz
sudo tar zvxf hbase-1.1.5-bin.tar.gz
sudo mv hbase-1.1.5 hbase
sudo chown -R hadoop ./hbase

./hbase/bin/hbase version
2019-01-25 17:23:11,310 INFO [main] util.VersionInfo: HBase 1.1.5
2019-01-25 17:23:11,311 INFO [main] util.VersionInfo: Source code repository git://diocles.local/Volumes/hbase-1.1.5/hbase revision=239b80456118175b340b2e562a5568b5c744252e
2019-01-25 17:23:11,311 INFO [main] util.VersionInfo: Compiled by ndimiduk on Sun May 8 20:29:26 PDT 2016
2019-01-25 17:23:11,312 INFO [main] util.VersionInfo: From source with checksum 7ad8dc6c5daba19e4aab081181a2457d

hadoop@iZuf68496ttdogcxs22w6sZ:/usr/local$ cat ~/.bashrc

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=$PATH:${JAVA_HOME}/bin:/usr/local/hbase/bin
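
After editing ~/.bashrc, reload it so the current shell picks up the new PATH (a standard bash step, not shown in the original transcript); hbase can then be run without the full path:

source ~/.bashrc
hbase version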

hadoop@iZuf68496ttdogcxs22w6sZ:/usr/local/hbase/conf$ vim hbase-site.xml

<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
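
The hbase.rootdir URI must point at the same NameNode address that Hadoop uses for fs.defaultFS; a minimal sketch of the matching entry in Hadoop's core-site.xml, assuming the pseudo-distributed setup from the referenced tutorial:

  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>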

hadoop@iZuf68496ttdogcxs22w6sZ:/usr/local/hbase$ grep -v ^# ./conf/hbase-env.sh | grep -v ^$
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HBASE_CLASSPATH=/usr/local/hadoop/conf
export HBASE_OPTS="-XX:+UseConcMarkSweepGC"
export HBASE_MASTER_OPTS="$HBASE_MASTER_OPTS -XX:PermSize=128m -XX:MaxPermSize=128m"
export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -XX:PermSize=128m -XX:MaxPermSize=128m"
export HBASE_MANAGES_ZK=true
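
The -XX:PermSize/-XX:MaxPermSize flags above target Java 7; on the Java 8 JDK set in JAVA_HOME the permanent generation no longer exists, which is what produces the "ignoring option PermSize" warnings during start-up below. They are harmless, but can be silenced by commenting out the two lines (only needed on JDK 8+):

# export HBASE_MASTER_OPTS="$HBASE_MASTER_OPTS -XX:PermSize=128m -XX:MaxPermSize=128m"
# export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -XX:PermSize=128m -XX:MaxPermSize=128m"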

hadoop@iZuf68496ttdogcxs22w6sZ:/usr/local/hbase$ ./bin/stop-hbase.sh
stopping hbase...............
localhost: stopping zookeeper.

hadoop@iZuf68496ttdogcxs22w6sZ:/usr/local/hbase$ jps
1648 SecondaryNameNode
1329 NameNode
8795 Jps
1471 DataNode
hadoop@iZuf68496ttdogcxs22w6sZ:/usr/local/hbase$ ./bin/start-hbase.sh
localhost: starting zookeeper, logging to /usr/local/hbase/bin/../logs/hbase-hadoop-zookeeper-iZuf68496ttdogcxs22w6sZ.out
starting master, logging to /usr/local/hbase/bin/../logs/hbase-hadoop-master-iZuf68496ttdogcxs22w6sZ.out
OpenJDK 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
starting regionserver, logging to /usr/local/hbase/bin/../logs/hbase-hadoop-1-regionserver-iZuf68496ttdogcxs22w6sZ.out
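
To confirm the daemons came up, running jps again should now also list HMaster, HRegionServer and HQuorumPeer (the ZooKeeper process HBase manages itself when HBASE_MANAGES_ZK=true) alongside the HDFS processes shown earlier:

jps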

Note: we need to set dfs.replication=3 in HDFS.
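
A sketch of that setting in Hadoop's hdfs-site.xml (the new value only applies to files written after the change):

  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>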

Once HBase starts normally, the /hbase directory can be seen in the HDFS web UI.
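
The same check can be done from the command line (assuming the Hadoop binaries are on PATH):

hdfs dfs -ls /hbase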

hbase(main):010:0> describe 'student'
Table student is ENABLED
student
COLUMN FAMILIES DESCRIPTION
{NAME => 'Sage', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'Sdept', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'Sname', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'Ssex', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'course', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
5 row(s) in 0.0330 seconds

hbase(main):011:0> disable 'student'
0 row(s) in 2.4280 seconds

hbase(main):012:0> drop 'student'
0 row(s) in 1.2700 seconds

hbase(main):013:0> list
TABLE
0 row(s) in 0.0040 seconds

=> []
hbase(main):002:0> create 'student','Sname','Ssex','Sage','Sdept','course'
0 row(s) in 1.3740 seconds

=> Hbase::Table - student
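
Each string after the table name declares a column family. The same create can also take per-family options via a dictionary, e.g. to keep more cell versions (a sketch, not run in this transcript):

create 'student', {NAME => 'Sname', VERSIONS => 3}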
hbase(main):003:0> scan 'student'
ROW COLUMN+CELL
0 row(s) in 0.1200 seconds

hbase(main):004:0> describe 'student'
Table student is ENABLED
student
COLUMN FAMILIES DESCRIPTION
{NAME => 'Sage', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'Sdept', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'Sname', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'Ssex', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'course', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
5 row(s) in 0.0420 seconds

hbase(main):005:0> put 'student', '95001', 'Sname', 'LiYing'
0 row(s) in 0.1100 seconds

hbase(main):006:0> scan 'student'
ROW COLUMN+CELL
95001 column=Sname:, timestamp=1548640148767, value=LiYing
1 row(s) in 0.0220 seconds

hbase(main):007:0> put 'student','95001','course:math','80'
0 row(s) in 0.0680 seconds

hbase(main):008:0> scan 'student'
ROW COLUMN+CELL
95001 column=Sname:, timestamp=1548640148767, value=LiYing
95001 column=course:math, timestamp=1548640220401, value=80
1 row(s) in 0.0170 seconds

hbase(main):009:0> delete 'student','95001','Sname'
0 row(s) in 0.0300 seconds

hbase(main):011:0> scan 'student'
ROW COLUMN+CELL
95001 column=course:math, timestamp=1548640220401, value=80
1 row(s) in 0.0170 seconds

hbase(main):012:0> get 'student','95001'
COLUMN CELL
course:math timestamp=1548640220401, value=80
1 row(s) in 0.0260 seconds
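
get can also be narrowed to a single column instead of the whole row (a sketch following the data above):

get 'student','95001','course:math'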

hbase(main):013:0> deleteall 'student','95001'
0 row(s) in 0.0180 seconds

hbase(main):014:0> scan 'student'
ROW COLUMN+CELL
0 row(s) in 0.0170 seconds

hbase(main):015:0> disable 'student'
0 row(s) in 2.2720 seconds

hbase(main):017:0> drop 'student'
0 row(s) in 1.2530 seconds

hbase(main):018:0> list
TABLE
0 row(s) in 0.0060 seconds

=> []

  
