Waterdrop startup error

I am using standalone Spark with the standalone Waterdrop distribution, but it failed on startup with the following error:

[root@localhost waterdrop-1.3.8]# ./bin/start-waterdrop.sh --master local[4] --deploy-mode client --config ./config/streaming.conf

[INFO] spark conf: --conf "spark.streaming.batchDuration=5" --conf "spark.executor.memory=1g" --conf "spark.executor.cores=1" --conf "spark.executor.instances=2" --conf "spark.app.name=Waterdrop"
Warning: Ignoring non-spark config property: "spark.executor.memory=1g"
Warning: Ignoring non-spark config property: "spark.app.name=Waterdrop"
Warning: Ignoring non-spark config property: "spark.streaming.batchDuration=5"
Warning: Ignoring non-spark config property: "spark.executor.instances=2"
Warning: Ignoring non-spark config property: "spark.executor.cores=1"
[INFO] Loading config file: ./config/streaming.conf
[INFO] parsed config file: {
    "spark" : {
        "spark.streaming.batchDuration" : 5,
        "spark.app.name" : "Waterdrop",
        "spark.executor.instances" : 2,
        "spark.executor.cores" : 1,
        "spark.executor.memory" : "1g"
    },
    "input" : [
        {
            "rate" : 1,
            "plugin_name" : "fakestream",
            "content" : [
                "Hello World, InterestingLab"
            ]
        }
    ],
    "filter" : [
        {
            "delimiter" : ",",
            "fields" : [
                "msg",
                "name"
            ],
            "plugin_name" : "split"
        }
    ],
    "output" : [
        {
            "plugin_name" : "stdout"
        }
    ]
}

[INFO] loading SparkConf: 
	spark.app.name => Waterdrop
	spark.master => local[4]
	spark.executor.memory => 1g
	spark.executor.instances => 2
	spark.streaming.batchDuration => 5
	spark.executor.extraJavaOptions => 
	spark.jars => file:/opt/software/waterdrop-1.3.8/lib/Waterdrop-1.3.8-2.11.8.jar
	spark.executor.cores => 1
	spark.submit.deployMode => client
	spark.driver.extraJavaOptions => 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/08/08 19:09:38 INFO SparkContext: Running Spark version 2.1.1
19/08/08 19:09:39 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/08/08 19:09:40 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.12.129 instead (on interface eth2)
19/08/08 19:09:40 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/08/08 19:09:40 INFO SecurityManager: Changing view acls to: root
19/08/08 19:09:40 INFO SecurityManager: Changing modify acls to: root
19/08/08 19:09:40 INFO SecurityManager: Changing view acls groups to: 
19/08/08 19:09:40 INFO SecurityManager: Changing modify acls groups to: 
19/08/08 19:09:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
19/08/08 19:09:40 INFO Utils: Successfully started service 'sparkDriver' on port 46744.
19/08/08 19:09:40 INFO SparkEnv: Registering MapOutputTracker
19/08/08 19:09:40 INFO SparkEnv: Registering BlockManagerMaster
19/08/08 19:09:40 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/08/08 19:09:40 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/08/08 19:09:40 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-23e32ca0-03f4-4971-bcda-292115b0326a
19/08/08 19:09:40 INFO MemoryStore: MemoryStore started with capacity 413.9 MB
19/08/08 19:09:41 INFO SparkEnv: Registering OutputCommitCoordinator
19/08/08 19:09:41 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/08/08 19:09:41 INFO Utils: Successfully started service 'SparkUI' on port 4041.
19/08/08 19:09:41 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.12.129:4041
19/08/08 19:09:41 INFO SparkContext: Added JAR file:/opt/software/waterdrop-1.3.8/lib/Waterdrop-1.3.8-2.11.8.jar at spark://192.168.12.129:46744/jars/Waterdrop-1.3.8-2.11.8.jar with timestamp 1565316581520
19/08/08 19:09:41 INFO Executor: Starting executor ID driver on host localhost
19/08/08 19:09:41 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42540.
19/08/08 19:09:41 INFO NettyBlockTransferService: Server created on 192.168.12.129:42540
19/08/08 19:09:41 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/08/08 19:09:41 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.12.129, 42540, None)
19/08/08 19:09:41 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.12.129:42540 with 413.9 MB RAM, BlockManagerId(driver, 192.168.12.129, 42540, None)
19/08/08 19:09:41 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.12.129, 42540, None)
19/08/08 19:09:41 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.12.129, 42540, None)
19/08/08 19:09:42 INFO SharedState: Warehouse path is 'file:/opt/software/waterdrop-1.3.8/spark-warehouse'.
find and register UDFs & UDAFs
found and registered UDFs count[0], UDAFs count[0]
Exception in thread "main" scala.MatchError: java.awt.AWTError: Can't connect to X11 window server using 'localhost:11.0' as the value of the DISPLAY variable. (of class java.awt.AWTError)
	at io.github.interestinglab.waterdrop.Waterdrop$.main(Waterdrop.scala:38)
	at io.github.interestinglab.waterdrop.Waterdrop.main(Waterdrop.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
19/08/08 19:09:46 INFO SparkContext: Invoking stop() from shutdown hook
19/08/08 19:09:46 INFO SparkUI: Stopped Spark web UI at http://192.168.12.129:4041
19/08/08 19:09:46 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/08/08 19:09:46 INFO MemoryStore: MemoryStore cleared
19/08/08 19:09:46 INFO BlockManager: BlockManager stopped
19/08/08 19:09:46 INFO BlockManagerMaster: BlockManagerMaster stopped
19/08/08 19:09:46 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/08/08 19:09:46 INFO SparkContext: Successfully stopped SparkContext
19/08/08 19:09:46 INFO ShutdownHookManager: Shutdown hook called
19/08/08 19:09:46 INFO ShutdownHookManager: Deleting directory /tmp/spark-3c59c617-76f2-44fd-9ddc-c49ca808ea19
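For reference, the parsed JSON printed in the log corresponds to a `streaming.conf` roughly like the following. Waterdrop 1.x configs are written in HOCON with `input`/`filter`/`output` plugin blocks; this is a sketch reconstructed from the log output above, not my exact file:

```
spark {
  spark.streaming.batchDuration = 5
  spark.app.name = "Waterdrop"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "1g"
}

input {
  fakestream {
    content = ["Hello World, InterestingLab"]
    rate = 1
  }
}

filter {
  split {
    fields = ["msg", "name"]
    delimiter = ","
  }
}

output {
  stdout {}
}
```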

    This bug is actually not caused by Waterdrop or Spark. It comes from Xshell, most likely a version issue: I was using Xshell 6, whose X11 forwarding sets the DISPLAY variable (the `localhost:11.0` in the error message), so the JVM tries to open an X11 connection and fails with `java.awt.AWTError`, which then surfaces as a `scala.MatchError` at Waterdrop.scala:38 because Waterdrop's error handling does not expect it. Launching the same command directly on the Linux machine works without any problem. I normally always launch through Xshell, so I never expected this pitfall.
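If you do need to launch through Xshell, one workaround (assuming the DISPLAY variable set by X11 forwarding is the trigger, as the error message suggests) is to clear DISPLAY in the session before starting Waterdrop:

```shell
# Xshell's X11 forwarding exports DISPLAY (here 'localhost:11.0'),
# which makes the JVM attempt an X11 connection and throw AWTError.
# Clearing it for the current shell session avoids that:
unset DISPLAY
echo "DISPLAY is now: ${DISPLAY:-unset}"

# Then launch as usual:
# ./bin/start-waterdrop.sh --master local[4] --deploy-mode client --config ./config/streaming.conf
```

Alternatively, disabling X11 forwarding in the Xshell session properties should have the same effect. Passing `-Djava.awt.headless=true` through `spark.driver.extraJavaOptions` might also work, but I have not verified that.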
