mongo-spark-connector: fixing the bug "Decimal scale (12) cannot be greater than precision (1)" caused by a long-precision 0.0 in Mongo

mongo-spark-connector_2.11-2.1.2.jar


When using mongo-spark-connector_2.11-2.1.0.jar, a 0.0 stored in Mongo with long precision causes the validation check to fail when the value is converted to a Java BigDecimal during schema inference. This bug is already fixed in mongo-spark-connector_2.11-2.1.2, but that jar uses Java 8 methods; after some modifications of my own, it can now be used on Java 7 as well. A small demo of the root cause follows.
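The demo below shows why inference blows up. A "long-precision" zero from Mongo (the exact stored literal here is my assumption, inferred from the error message) maps to a java.math.BigDecimal whose precision is 1, because the precision of zero is defined as 1 in the BigDecimal javadoc, while its scale is 12:

import java.math.BigDecimal;

public class ZeroPrecisionDemo {
    public static void main(String[] args) {
        // 0.0 stored with 12 fractional digits, e.g. 0E-12 from Mongo
        BigDecimal zero = new BigDecimal("0.000000000000");
        System.out.println(zero.precision()); // prints 1
        System.out.println(zero.scale());     // prints 12
        // The 2.1.0 schema inference passes these two numbers into
        // DataTypes.createDecimalType(precision, scale), and Spark's
        // DecimalType constructor rejects scale > precision, producing
        // exactly the AnalysisException in the stack trace below.
    }
}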
Error example:

User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, njtest-cdh5-dn02.nj, executor 2): org.apache.spark.sql.AnalysisException: Decimal scale (12) cannot be greater than precision (1).;
at org.apache.spark.sql.types.DecimalType.<init>(DecimalType.scala:45)
at org.apache.spark.sql.types.DecimalType$.apply(DecimalType.scala:42)
at org.apache.spark.sql.types.DataTypes.createDecimalType(DataTypes.java:123)
at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getDataType(MongoInferSchema.scala:248)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument$1.apply(MongoInferSchema.scala:114)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument$1.apply(MongoInferSchema.scala:114)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument(MongoInferSchema.scala:114)
at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getDataType(MongoInferSchema.scala:231)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$getCompatibleArraySchema$1.apply(MongoInferSchema.scala:278)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$getCompatibleArraySchema$1.apply(MongoInferSchema.scala:274)
at scala.collection.IterableLike$class.takeWhile(IterableLike.scala:160)
at scala.collection.AbstractIterable.takeWhile(Iterable.scala:54)
at com.mongodb.spark.sql.MongoInferSchema$.getCompatibleArraySchema(MongoInferSchema.scala:274)
at com.mongodb.spark.sql.MongoInferSchema$.getSchemaFromArray(MongoInferSchema.scala:257)
at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getDataType(MongoInferSchema.scala:227)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument$1.apply(MongoInferSchema.scala:114)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument$1.apply(MongoInferSchema.scala:114)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument(MongoInferSchema.scala:114)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$2.apply(MongoInferSchema.scala:78)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$2.apply(MongoInferSchema.scala:78)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.aggregate(TraversableOnce.scala:214)
at scala.collection.AbstractIterator.aggregate(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1135)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1135)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1136)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1136)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
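Conceptually, the fix is to make the inferred precision at least as large as the scale before building the Spark DecimalType. The sketch below is my own illustration of that guard, not the actual upstream patch in 2.1.2, and the helper name is hypothetical:

import java.math.BigDecimal;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.DecimalType;

public class SafeDecimalInference {
    // Hypothetical helper, not the connector's real method: clamp the
    // precision to the scale so DecimalType's check can never fire.
    static DecimalType inferDecimalType(BigDecimal value) {
        int scale = value.scale();
        int precision = Math.max(value.precision(), scale);
        return DataTypes.createDecimalType(precision, scale);
    }

    public static void main(String[] args) {
        // The long-precision zero now infers as DecimalType(12,12)
        // instead of throwing an AnalysisException.
        System.out.println(inferDecimalType(new BigDecimal("0.000000000000")));
    }
}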

File-download-link: the fixed jar has been uploaded, please download it yourself.
