Spark writing to ArangoDB: com.arangodb.velocypack.exception.VPackParserException: java.lang.InstantiationException

Summary: A VPackParserException is thrown when connecting to ArangoDB from Scala. After updating the code the document count is correct, but printing the documents only yields nf(null).

2017-01-11 16:10:02


I am trying to connect my ArangoDB database with Scala, but when the connection is made and I run the operation, I get a VPackParserException. My code:

```scala
import com.arangodb.spark.{ArangoSpark, ReadOptions}
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import scala.beans.BeanProperty

object ArangotoSpark {
  def main(args: Array[String]) {
    case class netflow(@BeanProperty SrcHost: String,
                       @BeanProperty DstHost: String,
                       @BeanProperty SrcPort: String,
                       @BeanProperty DstPort: String,
                       @BeanProperty Protocol: String,
                       @BeanProperty StartTS: String,
                       @BeanProperty EndTS: String,
                       @BeanProperty Packets: Int,
                       @BeanProperty Bytes: Int) { }

    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[*]")
      .set("arangodb.host", "127.0.0.2")
      .set("arangodb.port", "8529")
      .set("arangodb.user", "root")
      .set("arangodb.password", "rut")
      .set("arangodb.database", "netflow")
    val sc = new SparkContext(conf)

    val rdd = ArangoSpark.load[netflow](sc, "N201701031130", ReadOptions("netflow"))
    val rdd2 = rdd.filter { x => x.SrcHost.matches("15.33.165.30") }
    rdd2.count()
  }
}
```
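For context, a `java.lang.InstantiationException` from the VelocyPack deserializer usually means the target class cannot be created reflectively: a case class declared inside `main` is a local (inner) class and has no usable no-argument constructor. A minimal sketch of a deserialization-friendly bean, assuming the driver instantiates it via a no-arg constructor and then calls setters (the class name `Netflow` and the reduced field list are illustrative, not from the original post):

```scala
import scala.beans.BeanProperty

// Top-level class (not nested inside main), with a no-argument
// constructor so it can be instantiated reflectively; @BeanProperty
// on `var` fields generates the getters/setters a bean mapper expects.
case class Netflow(@BeanProperty var SrcHost: String,
                   @BeanProperty var DstHost: String) {
  def this() = this(null, null) // no-arg constructor for reflection
}

object NoArgCtorCheck {
  def main(args: Array[String]): Unit = {
    // Roughly what a reflective deserializer does first:
    val instance = classOf[Netflow].getDeclaredConstructor().newInstance()
    println(instance) // Netflow(null,null)
  }
}
```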

Any help is appreciated. Thank you.

UPDATE: Now my code looks like this:

```scala
case class nf(@BeanProperty cadena: String) {
  def this() = this(cadena = null)
}

val rdd = ArangoSpark.load[nf](sc, "N201701031130", ReadOptions("netflow"))
println(rdd.count())
println("*************************************")
rdd.collect.foreach(println(_))
```

rdd.count gives the correct number of documents, but when I try to print them, I only get nf(null) lines.
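For what it's worth, `nf(null)` typically means deserialization itself succeeded but none of the stored documents has an attribute named `cadena`, so the setter is never called and the field keeps its `null` default. A sketch of a class whose fields mirror the document attributes, assuming the documents carry names like `SrcHost` from the original `netflow` class (the attribute names are an assumption and must match the stored documents exactly, including case):

```scala
import scala.beans.BeanProperty

// Field names must match the ArangoDB attribute names exactly;
// @BeanProperty needs `var` so the generated setters exist.
case class NetflowDoc(@BeanProperty var SrcHost: String,
                      @BeanProperty var DstHost: String,
                      @BeanProperty var Packets: Int) {
  def this() = this(null, null, 0) // no-arg constructor for the deserializer
}

// Hypothetical usage against the collection from the question:
// val rdd = ArangoSpark.load[NetflowDoc](sc, "N201701031130", ReadOptions("netflow"))
// rdd.collect.foreach(println)  // fields populated instead of null
```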
