1. Create a Maven project named Hive
2. Add the dependency
<dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-exec -->
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>1.2.1</version>
    </dependency>
</dependencies>
3. Create a class
package com.atguigu.hive;

import org.apache.hadoop.hive.ql.exec.UDF;

// A simple UDF that lowercases its input string; returns null on null input.
public class Lower extends UDF {
    public String evaluate(final String s) {
        if (s == null) {
            return null;
        }
        return s.toLowerCase();
    }
}
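Before packaging, the evaluate() logic can be sanity-checked locally. The sketch below reproduces the same null-safe lowercase logic in plain Java (without the hive-exec dependency, so it runs standalone); the class name LowerSketch is made up for this local test only.

```java
// Standalone sketch mirroring Lower.evaluate() from the UDF above,
// written without the hive-exec dependency so it can run locally.
public class LowerSketch {
    // Null-safe lowercase, same behavior as the UDF's evaluate()
    public static String evaluate(final String s) {
        if (s == null) {
            return null;
        }
        return s.toLowerCase();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("SMITH")); // smith
        System.out.println(evaluate(null));    // null
    }
}
```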
4. Package it as a jar and upload it to the server: /opt/module/jars/hive_udf-0.0.1-SNAPSHOT.jar
5. Upload the jar to the target directory on HDFS
hadoop fs -put hive_udf-0.0.1-SNAPSHOT.jar /
6. Create the function (the symbol must match the fully qualified class name defined above)
[hadoop103:21000] > create function mylower(string) returns string location '/hive_udf-0.0.1-SNAPSHOT.jar' symbol='com.atguigu.hive.Lower';
7. Use the custom function
[hadoop103:21000] > select ename, mylower(ename) from emp;
8. Verify the custom function with show functions
[hadoop103:21000] > show functions;
Query: show functions
+-------------+-----------------+-------------+---------------+
| return type | signature | binary type | is persistent |
+-------------+-----------------+-------------+---------------+
| STRING | mylower(STRING) | JAVA | false |
+-------------+-----------------+-------------+---------------+