Beeline connection with --hiveconf parameters fails: Cannot modify ** at runtime. It is in the list of parameters that can't be modified

Problem description

On a CDH cluster, connecting to HiveServer2 with beeline fails as soon as a few Hive-on-Spark parameters are passed on the command line. The error is: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify hive.spark.client.server.connect.timeout at runtime. It is in the list of parameters that can't be modified at runtime or is prefixed by a restricted variable. Full output:

[wzqi@cdh1 shell]$ beeline -u 'jdbc:hive2://ha01.bg.hadoop:10000/default;principal=hive/ha01.bg.hadoop@HA.CDH' --hiveconf hive.merge.smallfiles.avgsize=16000000 --hiveconf hive.execution.engine=spark --hiveconf hive.server2.long.polling.timeout=50ms --hiveconf hive.mapjoin.smalltable.filesize=100000000 --hiveconf hive.merge.sparkfiles=true --hiveconf spark.executor.extraJavaOptions=-XX:+UseG1GC --hiveconf hive.map.aggr.hash.percentmemory=0.5 --hiveconf hive.auto.convert.join.noconditionaltask=true --hiveconf hive.compute.query.using.stats=true --hiveconf spark.task.maxFailures=5 --hiveconf hive.spark.job.monitor.timeout=180s --hiveconf mapreduce.job.reduce.slowstart.completedmaps=1.0 --hiveconf hive.optimize.index.filter=true --hiveconf hive.stats.autogather=true --hiveconf hive.optimize.reducededuplication.min.reducer=4 --hiveconf hive.merge.mapredfiles=false --hiveconf hive.auto.convert.join.noconditionaltask.size=500000000 --hiveconf hive.fetch.task.conversion.threshold=1073741824 --hiveconf hive.optimize.bucketmapjoin.sortedmerge=false --hiveconf hive.exec.reducers.bytes.per.reducer=268435456 --hiveconf hive.optimize.reducededuplication=true --hiveconf hive.spark.client.connect.timeout=5s --hiveconf mapreduce.job.queuename=root.000ywb.bdhywbas_hive --hiveconf hive.spark.client.server.connect.timeout=1800s --hiveconf hive.merge.size.per.task=256000000 --hiveconf hive.spark.client.rpc.max.size=1610612736 --hiveconf hive.auto.convert.join=true --hiveconf hive.map.aggr=true --hiveconf hive.support.concurrency=false --hiveconf hive.limit.pushdown.memory.usage=0.4 --hiveconf hive.optimize.ppd=true --hiveconf hive.fetch.task.conversion=more --hiveconf hive.smbjoin.cache.rows=10000 --hiveconf spark.yarn.max.executor.failures=31 --hiveconf hive.stats.fetch.column.stats=true --hiveconf hive.optimize.sort.dynamic.partition=false --hiveconf hive.merge.mapfiles=true --incremental=true
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.1-1.cdh6.2.1.p4236.6050411/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.1-1.cdh6.2.1.p4236.6050411/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://ha01.bg.hadoop:10000/default;principal=hive/ha01.bg.hadoop@HA.CDH
21/05/11 15:07:46 [main]: WARN jdbc.HiveConnection: Failed to connect to ha01.bg.hadoop:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://ha01.bg.hadoop:10000/default;principal=hive/ha01.bg.hadoop@HA.CDH: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify hive.spark.client.server.connect.timeout at runtime. It is in the list of parameters that can't be modified at runtime or is prefixed by a restricted variable (state=08S01,code=0)
Beeline version 2.1.1-cdh6.2.1 by Apache Hive
beeline> !q
[wzqi@cdh1 shell]$
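
The parameter is rejected because it sits in hive.conf.restricted.list, which HiveServer2 refuses to let clients change at runtime and which, by default, already contains the hive.spark.client.* connection settings. Assuming a plain connection (without the offending --hiveconf options) still works, you can confirm what is currently restricted from a beeline session, for example:

[wzqi@cdh1 shell]$ beeline -u 'jdbc:hive2://ha01.bg.hadoop:10000/default;principal=hive/ha01.bg.hadoop@HA.CDH'
0: jdbc:hive2://ha01.bg.hadoop:10000/default> set hive.conf.restricted.list;

The printed value should include hive.spark.client.server.connect.timeout from the error above.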

 

Solution

Log in to the Cloudera Manager UI, select the cluster, open the Hive service, and go to Configuration.

In the property HiveServer2 Advanced Configuration Snippet (Safety Valve) for hive-site.xml, add the following entry:

Name: hive.conf.restricted.list

Value: hive.spark.client.channel.log.level,hive.spark.client.rpc.threads,hive.spark.client.secret.bits,hive.spark.client.rpc.server.address,hive.spark.client.rpc.server.port,hive.spark.client.rpc.sasl.mechanisms

 

PS: The Value above is simply the default restricted list with the parameters that appear in the error (and that you want to pass via --hiveconf, such as hive.spark.client.server.connect.timeout) left out; only those need to be removed from the list, everything else can stay restricted.
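
For reference, this Safety Valve entry amounts to adding one extra property to the hive-site.xml used by HiveServer2. The snippet below is only a sketch of the equivalent XML, reusing the value from above:

<property>
  <name>hive.conf.restricted.list</name>
  <value>hive.spark.client.channel.log.level,hive.spark.client.rpc.threads,hive.spark.client.secret.bits,hive.spark.client.rpc.server.address,hive.spark.client.rpc.server.port,hive.spark.client.rpc.sasl.mechanisms</value>
</property>

Save the change and restart HiveServer2 (redeploy the stale configuration) so that the new restricted list is picked up, then rerun the original beeline command.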

 
