Error reported when submitting a job with spark-submit:
Cannot create a table having a column whose name contains commas in Hive metastore. Table: `xxx`.`xxx`; Column: get_geohash(43.834269, 87.605044, 8);
Solution:
Give an alias to any computed column (such as a UDF call) in the SQL query passed to spark-submit, for example:
sql="select get_geohash(43.834269, 87.605044, 8) as b, ....."
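Why the alias helps: when a select expression has no alias, Spark uses the full expression text as the output column name, so the column is literally named `get_geohash(43.834269, 87.605044, 8)`, and the commas in that name are rejected when the table schema is written to the Hive metastore. A minimal Spark sketch of the failing and the fixed query (untested here; assumes a live SparkSession with the `get_geohash` UDF registered and a hypothetical source table `src`):

```scala
// Without an alias, the output column is named after the whole expression,
// including commas -- writing this to a Hive table fails with
// "Cannot create a table having a column whose name contains commas".
val bad = spark.sql("select get_geohash(43.834269, 87.605044, 8) from src")

// With an alias, the column is named plain "b" and the write succeeds.
// Table name "xxx.xxx" is a placeholder, as in the error message above.
val ok = spark.sql("select get_geohash(43.834269, 87.605044, 8) as b from src")
ok.write.mode("overwrite").saveAsTable("xxx.xxx")
```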
/opt/data/sbin/spark-submit --master yarn --deploy-mode cluster \
--queue root.default \
--name profile_confi_allfull_test \
--driver-memory 5G \
--executor-memory 10G \
--conf spark.network.timeout=1200s \
--class com.sparkcore.scala.SparkCal \
--jars hdfs://Hadoop/dmgroup/dba/commmon/udf/udf-manager-0.0.7-SNAPSHOT-jar-with-dependencies.jar \
/home/test2/dusw/spark_calculat-1.0.jar "$sql" "" 0