1) The file test4.txt in the local directory /home/hadoop/test (fields on each line separated by a tab) contains the following:

[hadoop@master test]$ sudo vim test4.txt

1	dajiangtai
2	hadoop
3	hive
4	hbase
5	spark
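
The gaps above are single tab characters. If typing literal tabs in vim is awkward, the same file can be generated with printf, for example:

[hadoop@master test]$ printf '1\tdajiangtai\n2\thadoop\n3\thive\n4\thbase\n5\tspark\n' > test4.txt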

2) Start HiveServer2

[hadoop@master test]$ cd ${HIVE_HOME}/bin
[hadoop@master bin]$ ll
-rwxr-xr-x hadoop hadoop beeline
drwxr-xr-x hadoop hadoop ext
-rwxr-xr-x hadoop hadoop hive
-rwxr-xr-x hadoop hadoop hive-config.sh
-rwxr-xr-x hadoop hadoop hiveserver2
-rwxr-xr-x hadoop hadoop metatool
-rwxr-xr-x hadoop hadoop schematool
[hadoop@master bin]$ ./hiveserver2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hive1.0.0/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
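
HiveServer2 listens on port 10000 by default and runs in the foreground here, so leave this shell open. Before touching the Java code, you can confirm the service accepts connections by pointing beeline (shipped in the same bin directory) at the same URL from a second shell; a quick sanity check along these lines:

[hadoop@master bin]$ ./beeline -u jdbc:hive2://master:10000/default -n hadoop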

3) Program code

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Hive {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver"; // Hive JDBC driver class
    private static String url = "jdbc:hive2://master:10000/default"; // HiveServer2 connection URL; HiveServer2 is the new service introduced in Hive 0.11.0
    private static String user = "hadoop"; // a user with write permission on HDFS
    private static String password = ""; // in non-secure mode the password is ignored; the user name determines who runs the query
    private static String sql = "";
    private static ResultSet res;

    public static void main(String[] args) {
        try {
            Class.forName(driverName); // load the HiveServer2 driver
            Connection conn = DriverManager.getConnection(url, user, password); // connect to the database given by the URL
            Statement stmt = conn.createStatement();
            String tableName = "testHiveDriverTable"; // name of the table to create

            // Step 1: drop the table if it already exists
            sql = "drop table if exists " + tableName;
            stmt.execute(sql);

            // Step 2: create the table
            sql = "create table " + tableName + " (key int, value string) row format delimited fields terminated by '\t' STORED AS TEXTFILE";
            stmt.execute(sql);

            // run "show tables"
            sql = "show tables '" + tableName + "'";
            res = stmt.executeQuery(sql);
            if (res.next()) {
                System.out.println(res.getString(1));
            }

            // run "describe table"
            sql = "describe " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1) + "\t" + res.getString(2));
            }

            // run "load data into table"
            String filepath = "/home/hadoop/test/test4.txt"; // local path on the node where HiveServer2 runs, not on the JDBC client
            sql = "load data local inpath '" + filepath + "' into table " + tableName;
            stmt.execute(sql);

            // run "select * query"
            sql = "select * from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }

            // run a regular Hive query; this one is compiled into a MapReduce job
            sql = "select count(*) from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1));
            }
            conn.close();
            conn = null;
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(1);
        }
    }
}
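
At runtime the class needs only java.sql plus the Hive JDBC driver on the classpath. Outside an IDE, a minimal compile-and-run sketch looks like this, assuming the standalone JDBC jar seen in the SLF4J output above and using `hadoop classpath` to pull in the Hadoop client jars:

[hadoop@master test]$ javac Hive.java
[hadoop@master test]$ java -cp .:/opt/modules/hive1.0.0/lib/hive-jdbc-1.0.0-standalone.jar:$(hadoop classpath) Hive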

4) Run results (right-click --> Run As --> Run on Hadoop)

Running the program directly at this point fails; see the next post for the solution: HiveSQLException: Error while compiling statement: No privilege 'Create' found for outputs { database:default }

The run log is as follows:

INFO  [org.apache.hive.jdbc.Utils] - Supplied authorities: master:10000
INFO  [org.apache.hive.jdbc.Utils] - Resolved authority: master:10000
INFO  [org.apache.hive.jdbc.HiveConnection] - Will try to open client transport with JDBC Uri: jdbc:hive2://master:10000/default
DEBUG [org.apache.thrift.transport.TSaslTransport] - opening transport org.apache.thrift.transport.TSaslClientTransport@3834d63f
DEBUG [org.apache.thrift.transport.TSaslClientTransport] - Sending mechanism name PLAIN and initial response
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status START
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status COMPLETE
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Start message handled
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Main negotiation loop complete
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: SASL Client receiving last message
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Received message with status COMPLETE
... (repeated TSaslTransport "writing data length" / "CLIENT: reading data length" DEBUG messages omitted) ...
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
testhivedrivertable
... (TSaslTransport read/write DEBUG messages omitted) ...
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
key int
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
value string
... (TSaslTransport read/write DEBUG messages omitted) ...
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
1 dajiangtai
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
2 hadoop
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
3 hive
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
4 hbase
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
5 spark
... (TSaslTransport read/write DEBUG messages omitted) ...
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
5
... (TSaslTransport read/write DEBUG messages omitted) ...

Result of the "show tables" query:

testhivedrivertable

Result of the "describe table" query:

key    int
value string

Result of the "select * query":

1	dajiangtai
2	hadoop
3	hive
4	hbase
5	spark
Alternatively, check the results from the cluster:
hive> show tables;
OK
copy_student1
copy_student2
copy_student3
copy_student4
employee
group_gender_agg
group_gender_sum
group_test
index_test
index_tmp
partition_test
student1
student2
test
test_view
testhivedrivertable
user
Time taken: 0.153 seconds, Fetched: 17 row(s)
hive> desc testhivedrivertable;
OK
key int
value string
Time taken: 0.184 seconds, Fetched: 2 row(s)
hive> select * from testhivedrivertable;
OK
1 dajiangtai
2 hadoop
3 hive
4 hbase
5 spark
Time taken: 0.346 seconds, Fetched: 5 row(s)
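
Because load data local inpath moves the file into the table's storage directory, the raw data can also be inspected directly on HDFS; a sketch assuming the default warehouse location /user/hive/warehouse:

[hadoop@master test]$ hadoop fs -cat /user/hive/warehouse/testhivedrivertable/test4.txt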

That covers the main content of this section. It reflects my own learning process, and I hope it offers some guidance. If it helped, please leave a like; if it didn't, bear with me, and please point out any mistakes. Follow me to get updates as soon as they are posted. Thanks!

Copyright notice: this is an original post; do not reproduce without the author's permission.
