Spark SQL UDF Example

A UDF is a user-defined function: once registered, it can be called directly from SQL statements.

This example is based on Scala SDK 2.10.7 and Spark 2.0.0.
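For reference (not part of the original post), a minimal sbt setup matching those versions might look roughly like the sketch below; the artifact names assume the standard Spark 2.0.0 builds published for Scala 2.10, so adjust them to your environment.

// build.sbt -- assumed project setup, adjust versions and artifacts to your environment
scalaVersion := "2.10.7"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "2.0.0",
  "org.apache.spark" % "spark-sql_2.10"  % "2.0.0"
)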

package UDF_UDAF

import java.util

import org.apache.spark.sql.{RowFactory, SparkSession}
import org.apache.spark.SparkConf
import org.apache.spark.sql.api.java.UDF1
import org.apache.spark.sql.types.{DataTypes, StructField}

// Define a class that extends UDF1 (use UDF2, UDF3, UDF4, ... for UDFs with more arguments)
class UDF extends UDF1[String, Int] {
  override def call(t1: String): Int = {
    t1.length
  }
}

object UDF {
  def main(args: Array[String]): Unit = {
    val warehouseLocation = "/code/VersionTest/spark-warehouse" // Spark SQL warehouse directory
    val conf = new SparkConf().setMaster("local").setAppName("udf")
    val sparkSession = SparkSession.builder()
      .config(conf)
      .config("spark.sql.warehouse.dir", warehouseLocation) // set the warehouse location
      .getOrCreate()

    // Build a one-column DataFrame of names and expose it as a temp view
    val sc = sparkSession.sparkContext
    val nameRDD = sc.parallelize(Array("zhangsan", "lisi", "wangwu"))
    val rowRDD = nameRDD.map(s => RowFactory.create(s))

    val fields = new util.ArrayList[StructField]()
    fields.add(DataTypes.createStructField("name", DataTypes.StringType, true))
    val schema = DataTypes.createStructType(fields)

    val df = sparkSession.createDataFrame(rowRDD, schema)
    df.createOrReplaceTempView("user")

    // Register the UDF under the name "StrLen" and use it in a SQL query
    sparkSession.udf.register("StrLen", new UDF(), DataTypes.IntegerType)
    sparkSession.sql("select name, StrLen(name) as length from user").show()

    sparkSession.stop()
  }
}
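As a side note (not part of the original code), Spark also lets you register a UDF directly from a Scala function literal, which avoids writing a separate UDF1 subclass. A minimal sketch, assuming the same sparkSession and the "user" temp view created above; the name strLen2 is just an illustrative choice:

// Sketch: register the same string-length logic as a Scala function literal
sparkSession.udf.register("strLen2", (s: String) => s.length)
sparkSession.sql("select name, strLen2(name) as length from user").show()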

Result
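If everything is wired up correctly, the show() call should print something along these lines (the length column is simply the character count of each name):

+--------+------+
|    name|length|
+--------+------+
|zhangsan|     8|
|    lisi|     4|
|  wangwu|     6|
+--------+------+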
