spark-shuffle-site.xml configuration options

When the Spark external shuffle service runs inside the YARN NodeManager, you can put a spark-shuffle-site.xml on the NodeManager classpath to configure the shuffle service's parameters.
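Before spark-shuffle-site.xml is read at all, the shuffle service has to be registered as a NodeManager auxiliary service. A minimal yarn-site.xml sketch, following the service name `spark_shuffle` used in the Spark-on-YARN documentation (adjust the aux-services list to what your cluster already runs):

```xml
<!-- yarn-site.xml: register the Spark external shuffle service as a
     NodeManager auxiliary service. -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

The NodeManager must be restarted after this change, and the spark-&lt;version&gt;-yarn-shuffle.jar must be on its classpath.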

A sample spark-shuffle-site.xml listing the available options:

<?xml version="1.0"?>
<configuration>
  <property>
    <name>spark.shuffle.push.server.mergedShuffleFileManagerImpl</name>
    <value>org.apache.spark.network.shuffle.RemoteBlockPushResolver</value>
    <description>Required to enable push-based shuffle. Defaults to org.apache.spark.network.shuffle.NoOpMergedShuffleFileManager.</description>
  </property>
  <property>
    <name>spark.shuffle.io.mode</name>
    <value>NIO</value>
    <description>IO mode, NIO or EPOLL. Defaults to NIO.</description>
  </property>
  <property>
    <name>spark.shuffle.io.preferDirectBufs</name>
    <value>true</value>
    <description>Whether to prefer direct (off-heap) buffers. Defaults to true.</description>
  </property>
  <property>
    <name>spark.shuffle.io.connectionTimeout</name>
    <value>120s</value>
    <description>Defaults to ${spark.network.timeout}.</description>
  </property>
  <property>
    <name>spark.network.timeout</name>
    <value>120s</value>
    <description>Connection idle timeout. Defaults to 120s; connections idle for longer than this are closed by the IdleStateHandler.</description>
  </property>
  <property>
    <name>spark.shuffle.io.connectionCreationTimeout</name>
    <value>120s</value>
    <description>Defaults to ${spark.shuffle.io.connectionTimeout}.</description>
  </property>
  <property>
    <name>spark.shuffle.io.backLog</name>
    <value>-1</value>
    <description>Defaults to -1, which uses the OS default.</description>
  </property>
  <property>
    <name>spark.shuffle.io.numConnectionsPerPeer</name>
    <value>1</value>
    <description>Defaults to 1.</description>
  </property>
  <property>
    <name>spark.shuffle.io.serverThreads</name>
    <value>0</value>
    <description>Defaults to 0, which means 2x the number of cores.</description>
  </property>
  <property>
    <name>spark.shuffle.io.clientThreads</name>
    <value>0</value>
    <description>Defaults to 0, which means 2x the number of cores.</description>
  </property>
  <property>
    <name>spark.shuffle.io.receiveBuffer</name>
    <value>-1</value>
    <description>Defaults to -1, which uses the OS default.</description>
  </property>
  <property>
    <name>spark.shuffle.io.sendBuffer</name>
    <value>-1</value>
    <description>Defaults to -1, which uses the OS default.</description>
  </property>
  <property>
    <name>spark.shuffle.sasl.timeout</name>
    <value>30s</value>
    <description>Defaults to 30s.</description>
  </property>
  <property>
    <name>spark.shuffle.io.maxRetries</name>
    <value>3</value>
    <description>Defaults to 3.</description>
  </property>
  <property>
    <name>spark.shuffle.io.retryWait</name>
    <value>5s</value>
    <description>Defaults to 5s.</description>
  </property>
  <property>
    <name>spark.shuffle.io.lazyFD</name>
    <value>true</value>
    <description>Defaults to true. Whether to initialize FileDescriptors lazily. If true, file descriptors are created only when data is about to be transferred, which can reduce the number of open files.</description>
  </property>
  <property>
    <name>spark.shuffle.io.enableVerboseMetrics</name>
    <value>false</value>
    <description>Defaults to false. Whether to track detailed Netty memory metrics. If true, the detailed metrics of Netty's PooledByteBufAllocator are collected; otherwise only general memory usage is tracked.</description>
  </property>
  <property>
    <name>spark.shuffle.io.enableTcpKeepAlive</name>
    <value>false</value>
    <description>Defaults to false.</description>
  </property>
</configuration>
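The shuffle service reads this file through a TransportConf created with the module name `shuffle`, so every `io.*` suffix resolves to a full key of the form `spark.shuffle.io.*`. A minimal sketch of that naming convention, assuming a hypothetical helper `getConfKey` that mirrors (but is not) Spark's internal key-building logic:

```java
// Illustrates the "spark.<module>.<suffix>" key convention used by the
// shuffle service's TransportConf (module name "shuffle").
// ConfKeyDemo and getConfKey are illustrative, not part of Spark's API.
public class ConfKeyDemo {
    static String getConfKey(String module, String suffix) {
        return "spark." + module + "." + suffix;
    }

    public static void main(String[] args) {
        // The suffix "io.mode" becomes the full key looked up in this file.
        System.out.println(getConfKey("shuffle", "io.mode"));      // spark.shuffle.io.mode
        System.out.println(getConfKey("shuffle", "sasl.timeout")); // spark.shuffle.sasl.timeout
    }
}
```

This is why the same option can appear in the Spark docs under a different prefix (e.g. `spark.rpc.io.*` for the RPC module) while the shuffle service expects the `spark.shuffle.` spelling here.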