ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found

This post walks through the ClassNotFoundException raised when PhoenixOutputFormat is used from Spark, shows the full stack trace, and explains the fix: adding phoenix-core-*.jar to Spark's classpath, with the concrete steps.
  • Problem
    ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
  • Full stack trace
Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2112)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:232)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:971)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:903)
    at org.apache.phoenix.spark.ProductRDDFunctions.saveToPhoenix(ProductRDDFunctions.scala:51)
    at com.mypackage.save(DAOImpl.scala:41)
    at com.mypackage.ProtoStreamingJob.execute(ProtoStreamingJob.scala:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.mypackage.SparkApplication.sparkRun(SparkApplication.scala:95)
    at com.mypackage.SparkApplication$delayedInit$body.apply(SparkApplication.scala:112)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
    at scala.App$class.main(App.scala:71)
    at com.mypackage.SparkApplication.main(SparkApplication.scala:15)
    at com.mypackage.ProtoStreamingJobRunner.main(ProtoStreamingJob.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2018)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2110)
    ... 30 more
  • Cause
    1. If this happens in the production environment, it means the PhoenixOutputFormat class cannot be found, i.e. phoenix-core-*.jar is missing from Spark's runtime classpath.
    2. If it happens in any other environment, it still almost certainly points to the same phoenix-core-*.jar problem (a sketch of the kind of call that triggers it follows this list).
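    For reference, the saveToPhoenix call that produces a stack trace like the one above usually looks like the minimal sketch below; the object name, table, columns, and ZooKeeper quorum are placeholders, so adjust them to your own schema and cluster:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.phoenix.spark._  // adds saveToPhoenix to RDD[Product]

    object PhoenixSaveSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("phoenix-save-sketch"))

        // Hypothetical Phoenix table OUTPUT_TABLE(ID BIGINT PRIMARY KEY, COL1 VARCHAR); replace with your own.
        val rows = sc.parallelize(Seq((1L, "foo"), (2L, "bar")))

        // Internally this configures a Hadoop job that writes through PhoenixOutputFormat
        // (see saveAsNewAPIHadoopFile in the trace), which is why phoenix-core must be on
        // the classpath of the driver and every executor.
        rows.saveToPhoenix("OUTPUT_TABLE", Seq("ID", "COL1"), zkUrl = Some("zk-host:2181"))

        sc.stop()
      }
    }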
  • Solution
    The fix below targets the production environment: Spark cannot find the jar while running the job, so we simply add it to Spark's classpath.
    Spark's classpath is usually defined in /etc/spark/conf/classpath.txt, which already lists a long pile of jars of unclear purpose; all that remains is to append the paths of the jars Phoenix needs to this file (note: this must be done on every machine in the cluster), as in the example below.
    Note: this approach feels rather crude, but I have not found a better way to solve the problem yet, so it will have to do for now.
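    The appended entry might look like the following (the path and version are hypothetical; use the phoenix-core jar that matches your installation, and add any other Phoenix dependency jars the same way):

    /usr/lib/phoenix/lib/phoenix-core-4.7.0-HBase-1.1.jar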

Feel free to reach out to me with more big-data questions, or questions about internet finance; I answer them for free.
