2.13 Using Hive's Built-in Functions and Writing Custom UDFs
UDF: User-Defined Function
I. Built-in functions
# list the built-in functions
hive (db_hive)> show functions;

# show the detailed usage of one function
hive (db_hive)> desc function extended split;
OK
tab_name
split(str, regex) - Splits str around occurances that match regex
Example:
> SELECT split('oneAtwoBthreeC', '[ABC]') FROM src LIMIT 1;
["one", "two", "three"]
Time taken: 0.005 seconds, Fetched: 4 row(s)
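A couple of related commands are useful when exploring the built-in functions. This is a small sketch; the quoted-pattern form of show functions is assumed to be supported by this Hive version:

# brief description only, without the extended example
hive (db_hive)> desc function split;

# list only the functions whose names match a quoted pattern
hive (db_hive)> show functions "xpath.*";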
II. UDF
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF
https://cwiki.apache.org/confluence/display/Hive/HivePlugins    # writing custom UDFs
Hive ships with a number of built-in functions, such as max/min, but they are limited in number; you can conveniently extend HiveQL by writing your own UDFs (user-defined functions).

There are three kinds of user-defined functions (illustrated with built-in functions in the sketch after the notes below):

UDF (User-Defined Function)
    one row in, one row out
UDAF (User-Defined Aggregation Function)
    aggregate function, many rows in, one row out; similar to count/max/min
UDTF (User-Defined Table-Generating Function)
    one row in, many rows out; e.g. explode() used with lateral view

Programming steps:
1. Extend org.apache.hadoop.hive.ql.exec.UDF
2. Implement one or more evaluate() methods; evaluate() can be overloaded.

Notes:
1. A UDF must have a return type; it may return null, but the return type cannot be void.
2. Hadoop Writable types such as Text/LongWritable are commonly used in UDFs; plain Java types are not recommended.
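As a rough illustration of the three categories, here is a sketch using only built-in functions against the emp table that appears later in this section (the deptno column is an assumption based on the classic emp schema):

-- UDF: one value in, one value out
select lower(ename) from emp;

-- UDAF: many rows in, one value out
select deptno, count(*) from emp group by deptno;

-- UDTF: one row in, many rows out (explode combined with lateral view)
select e.ename, w.word
from emp e lateral view explode(split('a,b,c', ',')) w as word
limit 6;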
Creating a UDF (method 1):
1. Creating Custom UDFs
### LowerUDF.java ###
package com.beifeng.senior.hive.udf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/**
 * 1. Implement one or more methods named "evaluate" which will be called by Hive.
 *
 * 2. "evaluate" should never be a void method. However it can return "null" if needed.
 *
 * @author root
 */
public class LowerUDF extends UDF {

    public Text evaluate(Text str) {
        // validate: return null for null input
        if (str == null) {
            return null;
        }
        // lower-case the input
        return new Text(str.toString().toLowerCase());
    }

    // simple local test
    public static void main(String[] args) {
        System.out.println(new LowerUDF().evaluate(new Text("HIVE")));
    }
}

# then package the class into a jar
[root@hadoop-senior datas]# pwd
/opt/datas
[root@hadoop-senior datas]# ls hiveudf.jar
hiveudf.jar
2. Usage
# add the jar to the session
hive (db_hive)> add jar /opt/datas/hiveudf.jar;
Added /opt/datas/hiveudf.jar to class path
Added resource: /opt/datas/hiveudf.jar

# register the function: my_lower is the function name to register, com.beifeng.senior.hive.udf.LowerUDF is the class name
hive (db_hive)> create temporary function my_lower as "com.beifeng.senior.hive.udf.LowerUDF";
OK
Time taken: 0.012 seconds

# check that the function is registered
hive (db_hive)> show functions;
...
my_lower
...

# test it in a query
hive (db_hive)> select ename, my_lower(ename) lowername from emp limit 5;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1554717689707_0031, Tracking URL = http://hadoop-senior.ibeifeng.com:8088/proxy/application_1554717689707_0031/
Kill Command = /opt/modules/hadoop-2.5.0/bin/hadoop job -kill job_1554717689707_0031
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2019-04-24 15:32:42,268 Stage-1 map = 0%, reduce = 0%
2019-04-24 15:32:47,387 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.28 sec
MapReduce Total cumulative CPU time: 1 seconds 280 msec
Ended Job = job_1554717689707_0031
MapReduce Jobs Launched:
Job 0: Map: 1 Cumulative CPU: 1.28 sec HDFS Read: 894 HDFS Write: 60 SUCCESS
Total MapReduce CPU Time Spent: 1 seconds 280 msec
OK
ename lowername
SMITH smith
ALLEN allen
WARD ward
JONES jones
MARTIN martin
Time taken: 10.548 seconds, Fetched: 5 row(s)
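A temporary function registered this way exists only in the current session and disappears when the CLI exits. A minimal sketch for removing it explicitly (same function name as registered above):

# drop the temporary function from the current session
hive (db_hive)> drop temporary function if exists my_lower;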
Creating a UDF (method 2):
With this method, the jar must be located on HDFS:
CREATE FUNCTION myfunc AS 'myclass' USING JAR 'hdfs:///path/to/jar';
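For reference, this is a sketch of the fuller documented form of the statement, which also allows a database-qualified function name and multiple resources (db_name, class_name and file_uri are placeholders):

CREATE FUNCTION [db_name.]function_name AS class_name
  [USING JAR|FILE|ARCHIVE 'file_uri' [, JAR|FILE|ARCHIVE 'file_uri'] ];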
1.
## upload the jar to HDFS
hive (db_hive)> dfs -mkdir -p /user/root/hive/jars/;
hive (db_hive)> dfs -put /opt/datas/hiveudf.jar /user/root/hive/jars/;
hive (db_hive)> dfs -ls -R /user/root/hive/jars;
-rw-r--r-- 1 root supergroup 910 2019-04-24 15:40 /user/root/hive/jars/hiveudf.jar

# create the function
hive (db_hive)> create function self_lower as 'com.beifeng.senior.hive.udf.LowerUDF' using jar 'hdfs://hadoop-senior.ibeifeng.com:8020/user/root/hive/jars/hiveudf.jar';
converting to local hdfs://hadoop-senior.ibeifeng.com:8020/user/root/hive/jars/hiveudf.jar
Added /tmp/5356b66f-bf56-4de6-abf8-30be8029fa8b_resources/hiveudf.jar to class path
Added resource: /tmp/5356b66f-bf56-4de6-abf8-30be8029fa8b_resources/hiveudf.jar
OK
Time taken: 0.025 seconds

# use the function
hive (db_hive)> select ename, self_lower(ename) lowername from emp limit 5;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1554717689707_0032, Tracking URL = http://hadoop-senior.ibeifeng.com:8088/proxy/application_1554717689707_0032/
Kill Command = /opt/modules/hadoop-2.5.0/bin/hadoop job -kill job_1554717689707_0032
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2019-04-24 15:53:28,378 Stage-1 map = 0%, reduce = 0%
2019-04-24 15:53:33,504 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
MapReduce Total cumulative CPU time: 1 seconds 350 msec
Ended Job = job_1554717689707_0032
MapReduce Jobs Launched:
Job 0: Map: 1 Cumulative CPU: 1.35 sec HDFS Read: 894 HDFS Write: 60 SUCCESS
Total MapReduce CPU Time Spent: 1 seconds 350 msec
OK
ename lowername
SMITH smith
ALLEN allen
WARD ward
JONES jones
MARTIN martin
Time taken: 10.549 seconds, Fetched: 5 row(s)
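Unlike the temporary function from method 1, a function created with CREATE FUNCTION is recorded in the metastore and remains visible across sessions (it is registered to the current database, so it may appear as db_hive.self_lower in show functions). A sketch for removing it when it is no longer needed:

# drop the permanent function (run in, or qualify the name with, the database it was created in)
hive (db_hive)> drop function if exists self_lower;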