A question about Keras activation functions!

Regression task: I need to predict multiple output parameters.

ReLU activation

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras import regularizers
from keras.optimizers import Adam

model = Sequential()
model.add(Dense(7, input_shape=(7,), activation='relu',
                kernel_initializer='random_normal', bias_initializer='random_normal'))
# model.add(Dropout(0.5))
model.add(Dense(units=7, activation='relu',
                kernel_regularizer=regularizers.l2(0.01),
                bias_regularizer=regularizers.l1(0.01),
                activity_regularizer=regularizers.l1(0.015)))
# model.add(Dropout(0.6))
model.add(Dense(units=2))
# note: `lr` was renamed `learning_rate` in newer Keras releases
model.compile(loss='mse', optimizer=Adam(learning_rate=0.005), metrics=['mae', 'mse'])
model.summary()

=======

Linear activation

model = Sequential()
model.add(Dense(7, input_shape=(7,), activation='linear',
                kernel_initializer='random_normal', bias_initializer='random_normal'))
# model.add(Dropout(0.5))
model.add(Dense(units=7, activation='linear',
                kernel_regularizer=regularizers.l2(0.01),
                bias_regularizer=regularizers.l1(0.01),
                activity_regularizer=regularizers.l1(0.015)))
# model.add(Dropout(0.6))
model.add(Dense(units=2))
model.compile(loss='mse', optimizer=Adam(learning_rate=0.005), metrics=['mae', 'mse'])
model.summary()

Does this mean that, for my model's training and optimization, linear is a better fit than ReLU?

----------------------

https://stackoverflow.com/questions/51023739/multidimensional-regression-with-keras

I found a similar question on Stack Overflow:

Try changing the activation on your first two layers to

activation='relu'

and see if that improves things at all by introducing non-linearity. You're currently just performing a series of linear transformations, so you're not really leveraging the power of a neural net in any way. There are a lot of other reasons why things might not be working as well as you hope, but they are a bit beyond the scope of a stackoverflow answer. If you have a big enough dataset, then regularization would be a good first thing to start reading up on though.
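The answer's point about "a series of linear transformations" can be checked directly: composing two affine maps always collapses into a single affine map, so a stack of all-linear Dense layers has no more expressive power than one Dense layer. A minimal NumPy sketch (with hypothetical random weights, not the trained weights of the model above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "linear Dense layers": y = W2 @ (W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(7, 7)), rng.normal(size=7)
W2, b2 = rng.normal(size=(2, 7)), rng.normal(size=2)

x = rng.normal(size=7)
two_layer = W2 @ (W1 @ x + b1) + b2

# The same computation collapsed into one affine map: y = W @ x + b
W = W2 @ W1
b = W2 @ b1 + b2
one_layer = W @ x + b

print(np.allclose(two_layer, one_layer))  # the two outputs are identical
```

So if the linear version trains better here, it may simply mean the underlying relationship is close to linear, or that the deeper non-linear model needs different hyperparameters; it does not mean linear activations make the network "deeper" in any useful sense.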
