30.
Spark SQL CASE WHEN usage:
https://sparkbyexamples.com/spark-case-when-otherwise-example/
how to write case with when condition in spark sql using scala - Stack Overflow
scala - SPARK SQL: Implement AND condition inside a CASE statement - Stack Overflow
The posts above already explain this in detail, so I won't repeat them here; I'm just listing the links.
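Those links also cover the DataFrame-side equivalent, when/otherwise from org.apache.spark.sql.functions. A minimal Scala sketch, assuming a DataFrame named df with a string column company_base (both hypothetical names here):

import org.apache.spark.sql.functions.{col, when}

// Minimal sketch: map a region code to a province name with when/otherwise,
// assuming `df` has a string column `company_base` (hypothetical names).
val labeled = df.withColumn(
  "province",
  when(col("company_base") === "ah", "安徽")
    .when(col("company_base") === "bj", "北京")
    .otherwise("未知") // fallback when no branch matches, like ELSE in SQL
)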
Code example of a Spark SQL CASE expression with multiple WHEN conditions:
SPARK SQL - case when then - Stack Overflow
Pseudocode:
select
    CASE WHEN company_base = 'ah'  THEN '安徽'
         WHEN company_base = 'bj'  THEN '北京'
         WHEN company_base = 'cq'  THEN '重庆'
         WHEN company_base = 'fj'  THEN '福建'
         WHEN company_base = 'gd'  THEN '广东'
         WHEN company_base = 'gx'  THEN '广西'
         WHEN company_base = 'gs'  THEN '甘肃'
         WHEN company_base = 'gz'  THEN '贵州'
         WHEN company_base = 'han' THEN '海南'
         WHEN company_base = 'heb' THEN '河北'
         WHEN company_base = 'hen' THEN '河南'
         WHEN company_base = 'hlj' THEN '黑龙江'
         WHEN company_base = 'hub' THEN '湖北'
         WHEN company_base = 'hun' THEN '湖南'
         WHEN company_base = 'jl'  THEN '吉林'
         WHEN company_base = 'js'  THEN '江苏'
         WHEN company_base = 'jx'  THEN '江西'
         WHEN company_base = 'ln'  THEN '辽宁'
         WHEN company_base = 'nmg' THEN '内蒙古'
         WHEN company_base = 'nx'  THEN '宁夏'
         WHEN company_base = 'sc'  THEN '四川'
         WHEN company_base = 'sd'  THEN '山东'
         WHEN company_base = 'sh'  THEN '上海'
         WHEN company_base = 'snx' THEN '陕西'
         WHEN company_base = 'sx'  THEN '山西'
         WHEN company_base = 'tj'  THEN '天津'
         WHEN company_base = 'xj'  THEN '新疆'
         WHEN company_base = 'xz'  THEN '西藏'
         WHEN company_base = 'yn'  THEN '云南'
         WHEN company_base = 'zj'  THEN '浙江'
         WHEN company_base = 'qh'  THEN '青海'
         WHEN company_base = 'tw'  THEN '台湾'
         ELSE '未知'
    END AS company_base
from company
group by company_base  -- the select list only reads company_base, so group on that column
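To run a query like this from Scala, the table has to be visible to Spark SQL first. A minimal sketch, assuming the data is already loaded into a DataFrame called companyDf (a hypothetical name) and a SparkSession named spark, with only two branches shown for brevity:

// Minimal sketch: expose the DataFrame as the `company` table, then run the CASE WHEN query.
companyDf.createOrReplaceTempView("company")

val provinces = spark.sql(
  """select
    |  CASE WHEN company_base = 'ah' THEN '安徽'
    |       WHEN company_base = 'bj' THEN '北京'
    |       ELSE '未知'
    |  END AS company_base
    |from company
    |group by company_base""".stripMargin)

provinces.show()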
This post is a sub-question split out from the summary thread, just to make it easier to look up.
For the full collection, see the pinned post:
Summary of PySpark and Spark error issues and usage notes for certain functions.