import numpy as np
import matplotlib.pyplot as plt

def sigmoid_rampup(current_epoch):
    # Ramps up from exp(-5) ~= 0.0067 at epoch 0 to 1.0 at epoch 15 and beyond.
    current = np.clip(current_epoch, 0.0, 15.0)
    phase = 1.0 - current / 15.0
    return np.exp(-5.0 * phase * phase).astype(np.float32)

if __name__ == '__main__':
    x = np.linspace(1, 20, num=20)
    y = sigmoid_rampup(x)
    plt.plot(x, y)
    plt.show()
The sigmoid_rampup function appears in the SE-SSD code, where it modulates the weight of the consistency loss (consistency_weight) within the total loss. The idea comes from: Antti Tarvainen and Harri Valpola. Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results. In NeurIPS, pages 1195–1204, 2017.
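The ramp-up can be plugged into a training loop roughly as follows. This is a minimal sketch, not SE-SSD's actual training code: `max_consistency_weight`, `detection_loss`, and `consistency_loss` are illustrative names assumed here for clarity.

```python
import numpy as np

def sigmoid_rampup(current_epoch, rampup_length=15.0):
    # Exponential ramp-up from Tarvainen & Valpola: exp(-5 * (1 - t)^2),
    # where t goes from 0 to 1 over the first `rampup_length` epochs.
    current = np.clip(current_epoch, 0.0, rampup_length)
    phase = 1.0 - current / rampup_length
    return float(np.exp(-5.0 * phase * phase))

# Hypothetical usage: scale the consistency term so it is nearly zero
# early in training (when teacher predictions are unreliable) and
# reaches full weight after the ramp-up period.
max_consistency_weight = 1.0
for epoch in range(20):
    w = max_consistency_weight * sigmoid_rampup(epoch)
    # total_loss = detection_loss + w * consistency_loss
```

Starting the weight near zero keeps the (initially noisy) teacher from dominating the loss before the student has learned anything useful.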
Training details: We adopt the ADAM optimizer and a cosine annealing learning rate.